Tales of What Works and Doesn’t to Achieve Enterprise Data Accuracy

Learn Why Mandates Don’t Result in Enterprise Data Accuracy

Amber Pifer, June 8, 2017


DATA INTEGRITY MANDATES:  WHY THEY AREN’T ALWAYS A GOOD THING

Having worked with customers struggling to overcome data integrity and quality issues for more than 8 years, I’ve seen my share of what works and what doesn’t from the school of hard knocks. That’s what got me thinking about how to synthesize these experiences into an ideal framework for improving data integrity and quality.

Think of this framework as a fictitious town called Pleasantville, where the homes are business rules, the schools are the frameworks, and the roads are the processes. However, life isn’t like Pleasantville, and simply following a few rules won’t result in your company achieving enterprise data accuracy. Experience is a tough teacher, and it has challenged my belief in what I once considered a tried-and-true model. Nevertheless, all isn’t lost: I still believe that many of the characteristics I envision for a successful framework hold true. For example, executive backing of data integrity is always an indicator of which organizations will weather the storm and overcome setbacks. The same holds true of having a committed group, or Center of Excellence, dedicated to the ongoing maintenance and growth of a data program. But as we all know, the devil is in the details.

MANDATES ARE GREAT, BUT THE CONTENT IS CRITICAL

For years I thought that the end goal for any customer with data integrity challenges was to create an internal standard for controls. I’ve seen a number of iterations of this: mandates for some sort of balancing, whether automated or manual; requirements built around a single solution; and, other times, a treasure trove of solutions assembled into a “controls” toolbox.

What now perplexes me is the actual need to call out mandatory control capabilities within a chosen solution. That’s the equivalent of saying every computer has to come with movie-editing software. The reality is that few people need that capability. By insisting on mandatory automated controls, you risk overlooking whether the solution can perform the work that actually has to be done: ensuring the data you need is accurately verified, balanced and reconciled. Data validations should be as unique as the processes they affect, and they should be applied methodically, based on requirements rather than strict policy mandates.
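To make that concrete, here is a minimal sketch (in Python, with hypothetical file names, column names and a tolerance I chose for illustration) of what a requirements-driven balancing check might look like: record counts and amount totals are compared between a source extract and a target load, with a tolerance set for this specific interface rather than imposed by a blanket mandate.

```python
import csv

def load_totals(path, amount_field):
    """Return (row_count, amount_total) for a delimited extract."""
    count, total = 0, 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            count += 1
            total += float(row[amount_field])
    return count, total

# Hypothetical source and target extracts; file and column names are assumptions.
src_count, src_total = load_totals("source_extract.csv", "amount")
tgt_count, tgt_total = load_totals("target_load.csv", "amount")

# A requirement-driven tolerance: this interface allows a small rounding
# difference, another interface might require an exact match.
TOLERANCE = 0.01

if src_count != tgt_count or abs(src_total - tgt_total) > TOLERANCE:
    print(f"Out of balance: {src_count}/{src_total:.2f} vs {tgt_count}/{tgt_total:.2f}")
else:
    print("Source and target balance within tolerance.")
```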

IS THERE SUCH A THING AS CONTROLS OVERKILL?

To carry on from my previous point, it can be easy to forget the value of data validations when certain expectations are forced upon users.  Each set of “controls” that are implemented in a process do, in fact, add workload to business or application owners.  It is critical to ensure that only the truly necessary controls are put in place so that those supporting and utilizing the solutions are not bogged down by unnecessary work.  Analysis and prioritization of control gaps will help to determine which controls are most useful.  “Is there a high risk of duplicate data in the process?” If the answer is “no,” the next question could be, “what would be the result of duplicate data being processed?”  If the outcome isn’t critical, it may not make sense to add a duplicate check to a process…even if it is simple to add.
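For illustration, a duplicate check itself is rarely hard to build; the sketch below (Python, with a hypothetical feed and key column) shows roughly what one looks like. The point is that ease of implementation is not the deciding factor — whether to run it at all should come from the risk and impact questions above.

```python
import csv
from collections import Counter

def find_duplicates(path, key_field):
    """Return key values that appear more than once in the extract."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row[key_field]] += 1
    return [key for key, n in counts.items() if n > 1]

# Hypothetical feed and key column; only worth running if duplicates
# would actually cause a critical outcome downstream.
dupes = find_duplicates("daily_transactions.csv", "transaction_id")
if dupes:
    print(f"{len(dupes)} duplicate keys found, e.g. {dupes[:5]}")
```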

DON’T GET STUCK IN THE BOX

How many of you have a fairly new vehicle and have actually read the owner’s manual from cover to cover?  My guess is not many.  Most people buy a new car and use it strictly for getting from point A to point B. Do you really know what all the lights on your dashboard mean?  Can you recite all the newest safety and convenience features of your vehicle?  If not, I’d bet you’re not getting the greatest value from your ride. To run the analogy in parallel: why would you want to benefit from only the bare-minimum capabilities of a data integrity solution?  That’s just the tip of the iceberg; dig deeper!

Calling out required data validation checks in a data integrity mandate runs the risk of settling into mediocrity.  If an organization is forced to execute the same types of checks over and over again for every project as part of its System Development Life Cycle (SDLC) process, the chosen tool will begin to morph into a limited tool with only a handful of capabilities.  Keeping the playbook open requires project members to truly understand the full breadth and depth of the solution at hand.  For example, generalizing project requirements may help users discover the extended capabilities of a data integrity solution beyond standard summary-level balancing and detailed reconciliation.  What about data quality checks (type conformance and value conformance), threshold checks, file monitoring capabilities, and more?  At the end of the day, we may all have a happy path to the end of the rainbow and the pot of gold, or in this case a truly successful data integrity solution.  To get there, I’d recommend testing the boundaries of the solution you invested in to truly get the return you deserve.
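To make those extended checks a bit more concrete, here is a minimal sketch (Python, with hypothetical field names, allowed values and thresholds) that combines type conformance, value conformance and a simple threshold check for a single feed. A real data integrity solution would express these as configurable rules rather than code, but the logic is the same.

```python
import csv
from datetime import datetime

# Hypothetical rules for one feed; real rules come from project requirements.
def check_record(row):
    issues = []
    # Type conformance: the field must parse as the expected type.
    try:
        amount = float(row["amount"])
    except ValueError:
        issues.append("amount is not numeric")
        amount = None
    try:
        datetime.strptime(row["posted_date"], "%Y-%m-%d")
    except ValueError:
        issues.append("posted_date is not a valid date")
    # Value conformance: the field must fall within an allowed domain.
    if row["status"] not in {"OPEN", "CLOSED", "PENDING"}:
        issues.append(f"unexpected status {row['status']!r}")
    # Threshold check: flag values outside an expected range.
    if amount is not None and not (0 <= amount <= 1_000_000):
        issues.append(f"amount {amount} outside expected range")
    return issues

with open("daily_transactions.csv", newline="") as f:
    for i, row in enumerate(csv.DictReader(f), start=1):
        for issue in check_record(row):
            print(f"row {i}: {issue}")
```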

To learn more about achieving data integrity and managing data quality issues, download the eBook below.

Get Insights

For a deeper dive into this topic, visit our resource center. Here you will find a broad selection of content that represents the compiled wisdom, experience, and advice of our seasoned data experts and thought leaders.

Download eBook