Data quality management is one of those concepts that is hard to define and hard to put into practice. That is why we have put together a simple 5-step guide with everyday examples to clarify what data quality means and how to get a grip on it.
An online search for the term ‘data quality management’ yields – among others – a link to techopedia.com. The site defines data quality management as ‘an administration type that incorporates the role establishment, role deployment, policies, responsibilities and processes with regard to the acquisition, maintenance, disposition and distribution of data. In order for a data quality management initiative to succeed, a strong partnership between technology groups and the business is required.’
Unfortunately, this definition does not really offer any suggestions on how to go about it. That is why we have put together 5 easy-to-understand steps towards optimal data quality.
Step 1. Define the kind of data to be collected and the elements within that data which should be optimized
Step 2. Define rules for each data element and automate the control mechanisms
Step 3. Assign the responsibility for the optimization of a data entity to a single person
Step 4. Automate data validation according to these definitions, and keep the results
Step 5. Correct exceptions and adjust the definitions if necessary
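Steps 1 and 2 can be made concrete with a small sketch. Assuming article master data with fields such as `article_id`, `weight_kg`, and `ean_code` (hypothetical names, not taken from any specific system), each data element gets one rule that a machine can check automatically:

```python
# Hypothetical rules for article master data (illustrative field names).
# Step 1: the elements to optimize; step 2: one automatable rule per element.
rules = {
    "article_id": lambda v: isinstance(v, str) and len(v) > 0,
    "weight_kg":  lambda v: isinstance(v, (int, float)) and v > 0,
    "ean_code":   lambda v: isinstance(v, str) and v.isdigit() and len(v) == 13,
}

def validate(record):
    """Return the names of the elements that violate their rule."""
    return [field for field, rule in rules.items() if not rule(record.get(field))]

print(validate({"article_id": "ZN-100", "weight_kg": -2, "ean_code": "87"}))
# → ['weight_kg', 'ean_code']  (weight must be positive, EAN must be 13 digits)
```

Once the rules live in one place like this, the control mechanism is simply a matter of running them against every record.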
How does it work?
Data is extracted from the data source of an application (step 1). This data is automatically evaluated by the Data Quality Monitor (step 4), according to the business rules resulting from the data definitions (step 2). Exceptions to those business rules are detected, collected, and presented to the data owner (step 3), who can now take appropriate corrective action, either by correcting the data, or by fine-tuning the data definitions.
By repeatedly going through this process, a continuous improvement of the quality of the data, and of the optimization process leading to it, is achieved (step 5).
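The cycle described above can be sketched in a few lines. The field names, rules, sample data, and owner address below are illustrative assumptions, not part of any actual Data Quality Monitor implementation:

```python
# A minimal sketch of the monitoring cycle: extract, evaluate against the
# business rules, collect exceptions, and route them to the data owner.
rules = {
    "ean_code": lambda v: isinstance(v, str) and v.isdigit() and len(v) == 13,
}
owners = {"ean_code": "master-data@example.com"}  # step 3: one owner per element

def run_monitor(records):
    """Step 4: evaluate extracted records and keep the exceptions found."""
    exceptions = []
    for rec in records:
        for field, rule in rules.items():
            if not rule(rec.get(field)):
                exceptions.append({"record": rec["id"], "field": field,
                                   "owner": owners[field]})
    return exceptions

# Step 1: data extracted from the application's data source (sample values).
source = [{"id": 1, "ean_code": "8712345678906"},
          {"id": 2, "ean_code": "87"}]

for exc in run_monitor(source):
    # Step 5: the owner corrects the data or fine-tunes the rule;
    # the next run of the monitor then verifies the fix.
    print(f"{exc['owner']} should review {exc['field']} on record {exc['record']}")
```

Each pass through the loop either shrinks the exception list or improves a rule, which is exactly the continuous improvement the process aims for.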
A real-world example: NedZink
NedZink is a good example of a company that has committed to optimizing the quality of their data.
A number of their larger DIY customers had asked NedZink to exchange article data through EDI. It did not take long for NedZink to realize that the quality of their master data just did not cut it for this kind of operation.
Their data was appropriate for internal use, where it was usable despite a few extra handling steps, but an external party would not accept that.
NedZink decided to go all the way and make data quality optimization a standard step in their process. The recurrent cleansing, in combination with the Data Quality Monitor, has dramatically increased the quality of the data.
The level of quality obtained, and the speed at which the project was finalized, meant that NedZink was ready for electronic data exchange long before their own customers were.
“We are most certainly pleased by the display of expertise and know-how by Type2Solutions. While many IT projects tend to be tedious, the data management project here was really painless. The data quality dashboard was set up in very little time and enables us to analyze the situation at a moment’s notice. Thanks to the involvement and proactivity of Type2Solutions, we have gained back the control and understanding of our business. The data makes sense again, which provides clarity, relief and structure.”