  • Our solutions are transparent, so that every data process is visible and audited to give you confidence in delivery
  • Our architecture is scalable, from the very smallest of information systems to full enterprise solutions
  • We ensure that all our systems are built for ease of maintenance, to reduce the cost and effort required in enhancing your solution
DATA SERVICES

Our data services revolve around integrating quality, trusted data and delivering it to many and varied information consumers. The following diagram represents our fully integrated approach to design, incorporating all aspects of data management from data profiling to master data management.

This architecture is not just theoretical: there are practical methods to implement and support it, and not all aspects have to be undertaken at once. Each component has been carefully designed to operate as a single unit that can be implemented independently of the others, yet remains extensible so that it integrates seamlessly with all the other components. The architecture is also vendor neutral, so it can be adapted to the tools in your environment.

There are three core processes in this architecture, namely:

  • Data Acquisition
  • Data Management (including data synchronisation)
  • Information Delivery

Data Acquisition is the process of extracting data from source systems and ensuring that it is quarantined and quality checked before being delivered to the data management process.

Data Management is the process that brings data from multiple disparate systems together into an integrated repository that becomes the single source of truth. It also incorporates data synchronisation, which provides an operational feedback loop to aid the synchronisation of master data back to operational systems.

Information Delivery is the process of building business-specific data models, focussed on making the delivery of data to information consumers as user friendly as possible. The information produced goes beyond basic reporting and encompasses dashboards, external system feeds and ad-hoc querying environments.

Each of these is explored in more detail below.

DATA ACQUISITION

Data Acquisition is the process of extracting data from source systems and ensuring that it is quarantined and quality checked before being delivered to the data management process. Undertaking this process ensures that all data is extracted, understood and quality checked before being loaded into the data warehouse, which helps reduce the maintenance overhead of dealing with rogue data values.

Data Acquisition Process

The data acquisition process comprises two core data processes that are supported by two management processes. Firstly, data is extracted and landed from source systems, then profiled to better understand its contents. Data profiling can be undertaken using specialised data profiling tools, although a carefully crafted set of SQL queries can also do the job.
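As a minimal sketch of the kind of profiling query set described above, the following Python example runs basic per-column profiling queries (row count, nulls, distinct values, minimum and maximum) against an in-memory SQLite table. The table and column names are purely illustrative, not taken from any real source system.

```python
import sqlite3

# Illustrative landed source table; names and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT, postcode TEXT)")
conn.executemany(
    "INSERT INTO customer VALUES (?, ?, ?)",
    [(1, "Alice", "3000"), (2, "Bob", None), (3, None, "3000")],
)

def profile_column(conn, table, column):
    """Basic profile for one column: null count, distinct count, min and max."""
    total, non_null, distinct, lo, hi = conn.execute(
        f"SELECT COUNT(*), COUNT({column}), COUNT(DISTINCT {column}), "
        f"MIN({column}), MAX({column}) FROM {table}"
    ).fetchone()
    return {"nulls": total - non_null, "distinct": distinct, "min": lo, "max": hi}

for col in ("id", "name", "postcode"):
    print(col, profile_column(conn, "customer", col))
```

The same five aggregate functions can be generated for every column of every landed table, which is why a modest script can substitute for a dedicated profiling tool on smaller projects.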

Once profiling has been undertaken, data cleansing rules can be created and used as part of the data quality firewall process to ensure that all data is quality checked before being processed. Data cleansing goes beyond name and address cleansing to include correct formatting of values, verification of values, type conversions and so on.
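The cleansing rules and quarantine step above can be sketched as follows. This is a toy illustration, not a real rule set: the field names, the four-digit postcode rule and the date format are all assumptions made for the example.

```python
from datetime import datetime

def cleanse(record):
    """Apply illustrative cleansing rules: formatting, verification and
    type conversion. Returns (clean_record, errors); a non-empty error
    list sends the record to quarantine instead of the warehouse."""
    errors = []
    clean = dict(record)

    # Formatting: normalise whitespace and case on the name.
    clean["name"] = " ".join(record.get("name", "").split()).title()

    # Verification: postcode must be four digits (hypothetical rule).
    postcode = str(record.get("postcode", ""))
    if not postcode.isdigit() or len(postcode) != 4:
        errors.append("invalid postcode")

    # Type conversion: parse the joined date string into a date object.
    try:
        clean["joined"] = datetime.strptime(record["joined"], "%Y-%m-%d").date()
    except (KeyError, ValueError):
        errors.append("invalid joined date")

    return clean, errors

passed, quarantined = [], []
for rec in [
    {"name": "  alice   smith", "postcode": "3000", "joined": "2010-05-01"},
    {"name": "bob", "postcode": "30", "joined": "not-a-date"},
]:
    clean, errors = cleanse(rec)
    (passed if not errors else quarantined).append((clean, errors))
```

Records that fail any rule are held with their error list attached, so the quarantine area records not just the rogue data but why it was rejected.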

DATA MANAGEMENT

The data management process brings data together from multiple disparate systems into an integrated repository that becomes the single source of truth. Before the data is loaded into this integrated repository, it is important that it has been through the data acquisition process, to ensure that the data has been transformed into a format conducive to integration and that its quality has been confirmed.

Data Management Process

Once the data has been loaded into integrated master data sets, it can be managed using a master data management tool that will allow fine tuning and enhancement of the data with information that is not readily available from the sources. As part of the master data management process, this data can be used for both analytical and operational purposes by either being fed downstream into the information delivery process or fed back to the source systems to aid with the enhancement of source system data.

The data marts are built using different perspectives of the same master data. This means that, for example, sales information can be viewed from the perspective of the salesperson or from the perspective of the product being sold. The transaction information for a sale is the same; however, the summary information for a sale will differ dependent upon the perspective. These different views of the same data are governed by a set of rules and definitions that describe how the master data can be viewed and summarised in different ways.
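The two perspectives on the same sales transactions can be shown in a few lines. The transaction records and field names below are invented for the example; the point is that one set of master transactions yields different summaries depending on the dimension chosen.

```python
from collections import defaultdict

# The same master sales transactions, summarised two ways.
sales = [
    {"salesperson": "Kim", "product": "Widget", "amount": 100},
    {"salesperson": "Kim", "product": "Gadget", "amount": 250},
    {"salesperson": "Lee", "product": "Widget", "amount": 300},
]

def summarise(transactions, perspective):
    """Total the identical transactions along a chosen dimension."""
    totals = defaultdict(int)
    for t in transactions:
        totals[t[perspective]] += t["amount"]
    return dict(totals)

by_person = summarise(sales, "salesperson")   # {"Kim": 350, "Lee": 300}
by_product = summarise(sales, "product")      # {"Widget": 400, "Gadget": 250}
```

The grand total is identical in both views; only the grouping rule differs, which is the governance point made above.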

INFORMATION DELIVERY

The information delivery process is about delivering information to consumers using a variety of different mediums depending upon their needs. At Information By Design, we deliberately use the term information delivery as opposed to reporting, as we believe that information can be consumed in more ways than this, e.g. dashboards, balanced scorecards, ad-hoc analysis and feeds to external bodies.

Information Delivery Process

Information delivery can be broken down into four basic categories: viewers, casual users, functional users and super users.

Viewers

Viewers expect information to be predefined and ready to go. They know the information that is relevant to them, do not want to have to hunt for it, and would rather have it delivered directly to them. This may vary from an email report in the morning to delivery via an intranet or even directly to their smartphones.

Casual Users

In addition to the privileges of a viewer, a casual user has the ability to refresh report information and/or to enter desired information parameters for the purposes of performing high-level research and analysis.

Functional Users

Functional users require access to functional data to perform detailed research and analysis. In addition to the privileges of a casual user, a functional user has the ability to develop their own ad hoc queries and perform online analysis of the data, and will also distribute data and analyses to casual users and viewers.

Super Users

Super users have a strong understanding of both the business and the technology, enabling them to access and analyse transactional data as well as the data marts. These users generally have full privileges to explore and analyse data with the applications available to them.

©Copyright 2010 Information By Design Pty Ltd. All rights reserved