Ravi Rao | March 29, 2018

To Move Past the Competition, an Analytics-Based Approach to Data Quality is a Must

Repeat business and customer loyalty are cultivated through organizations’ continued efforts to uncover relevant insights about customer behavior and translate those insights into intelligent actions that bring value to customers. The ability to do this quickly and repeatedly is key to maintaining an edge in a highly competitive landscape. As the variety, volume, and frequency of data available for analysis continue to increase exponentially, there are, at a high level, two main challenges in successfully applying and leveraging analytics in a timely manner. One challenge is technological: being able to ingest, process, store, and analyze the data. There are various mechanisms, tools, and solutions to help manage this challenge, and the topic can spawn a detailed discussion of its own, which is not what this blog is about.

The other challenge, which is often overlooked and, as a result, inevitably proves to be a significant damper on analytics ROI, is ensuring good data quality and governance. Without both elements—quality data and quick access to analysis—organizations will lack vital and timely information regarding customer behavior patterns.

Data quality standards haven’t declined; if anything, as the amount of data available has increased, the highly competitive nature of today’s on-demand economy has raised the bar for accurate and reliable data. However, those same vast amounts of data have made it far more difficult to enforce quality standards using traditional approaches. Data errors will never be completely eliminated, but to avoid issues ranging from the merely disruptive to the flat-out disastrous, organizations are compelled to embrace a culture of data quality and governance, along with the enabling technologies to ensure its success.

Re-evaluating Traditional Approaches to Data Quality and Governance

The traditional approach to data quality has been to deploy multiple controls and audit checks that verify the data at different points as it flows through the customer life cycle. These controls and checks are typically configured by users in one or more solutions or tools. Once configured, they run automatically, either on a schedule or triggered on data arrival. Assuming that the controls are deployed at the right points, and assuming that the environment is a fairly static one with controlled changes, this approach works well. When there is a change, such as a new core system being introduced or an upgrade to an existing one, the controls can be updated by the appropriate users to reflect it.
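To make the idea concrete, here is a minimal sketch of such a statically configured control in Python. The field names, thresholds, and sample records are hypothetical, and a real deployment would use a dedicated data quality tool rather than hand-rolled checks:

```python
def null_rate(records, field):
    """Fraction of records missing a value for `field`."""
    missing = sum(1 for r in records if r.get(field) in (None, ""))
    return missing / len(records) if records else 0.0

# Controls are configured up front by users, then run on a schedule
# or when a new batch of data arrives.
CONTROLS = [
    # (description, check that returns True when the control passes)
    ("customer_id populated", lambda rs: null_rate(rs, "customer_id") == 0.0),
    ("email at most 5% missing", lambda rs: null_rate(rs, "email") <= 0.05),
]

def run_controls(records):
    """Evaluate each configured control; return a list of (name, passed)."""
    return [(name, check(records)) for name, check in CONTROLS]

# A hypothetical incoming batch: the email control fails, so someone
# must notice and update the configuration when the data changes shape.
batch = [
    {"customer_id": "C001", "email": "a@example.com"},
    {"customer_id": "C002", "email": None},
]
results = run_controls(batch)
```

The key limitation is visible in the code itself: every control is a fixed rule that a person wrote down, deployed at a point a person chose.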

Well, you probably noticed the two big and unreasonable assumptions I made a couple of sentences ago. In the current-day enterprise environment, we all know that the sources and volume of data, the underlying systems, the customer base, and the need for analytics are rapidly and continually changing. When changes occur, competitive drivers require them to be implemented quickly, leaving little time for a controlled process to be followed. In other words, traditional approaches to data quality and governance need to be re-evaluated.

Using Analytics to Push the Boundaries of Data Quality

We can all agree that data quality is an important prerequisite to effective analytics. So, how about leveraging analytics to ensure data quality? The same types of analytics mechanisms that deliver valuable insights into customer behavior (machine learning, pattern matching, modeling) can and should be used to introduce a new and sustainable approach to data quality. As organizations change, organically or through acquisition, advanced analytics can be purposed to, among other things, continually and automatically:

  • Discover the points within the system/data infrastructure that need to be monitored
  • Discover changes in data and metadata content
  • Apply existing controls to new or changed data that is similar to or the same as existing data
  • Determine and apply new controls to new data that is not similar to existing data
  • Evaluate data quality levels/scores and alert when there is significant change
  • Develop a self-learning loop to facilitate continuous self-improvement
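One of the simplest of the ideas above, evaluating quality scores and alerting on significant change, can be sketched in a few lines. Rather than a hand-set threshold, the acceptable range is learned from the history of scores, so the baseline adapts as the data changes. The score values and the three-sigma rule here are illustrative assumptions, not a prescribed implementation:

```python
from statistics import mean, stdev

def is_significant_change(history, latest, sigmas=3.0):
    """Flag `latest` when it falls more than `sigmas` standard
    deviations away from the historical quality scores."""
    if len(history) < 2:
        return False  # not enough history to learn a baseline yet
    mu, sd = mean(history), stdev(history)
    if sd == 0:
        return latest != mu  # perfectly stable history: any change is news
    return abs(latest - mu) / sd > sigmas

# Hypothetical daily completeness scores for one data feed.
scores = [0.97, 0.98, 0.96, 0.97, 0.98, 0.97, 0.96, 0.98]

alert_on_drop = is_significant_change(scores, 0.71)   # sudden degradation
alert_on_noise = is_significant_change(scores, 0.97)  # normal variation
```

Feeding each day's score back into the history is what closes the self-learning loop in the last bullet: the control keeps tuning itself instead of waiting for a user to reconfigure it.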

Staying Ahead: An Analytics-Based Approach to Data Quality

Recognizing the importance of data quality, not just for analytics but for the overall good of the organization, is now a necessity. Recognition alone, however, without the means to adopt an analytics-based approach to data quality that fulfills the organization's needs, will not suffice. As organizations advance their analytics capabilities to gain and hold on to competitive advantage, they will have to apply the same analytics to their data governance and quality strategy. Machine learning, pattern recognition, and advanced modeling, among other techniques, should be effectively purposed to fulfill that strategy.

By committing to an analytics-based approach to data quality, businesses can ensure that their data assets are utilized to drive better decisions, improve the customer experience, and positively impact revenue and profitability.

To learn more about automating data quality to unlock insights, download this datasheet.
