We’ve said it before, but the message bears repeating: data produces value only if it is accurate and trustworthy. Few things are more critical than high-quality data. Poor-quality data can completely undermine analytical initiatives and lead to inferior outcomes and flawed business decisions.
Most organizations know this by now, yet poor data quality continues to derail big data initiatives across every industry. Organizations understand that quality data doesn’t happen by accident, but recognizing the problem does not produce a solution, and many data quality approaches continue to fall short. The culprit is a set of common misconceptions that cause some organizations to mismanage their data quality efforts. Below are a few of the misconceptions organizations stumble over when implementing data management strategies.
The problem with data quality is that categorizing it as “bad” or “good” can be a moving target. If data is simply inaccurate, then there is general consensus that it constitutes “bad” data, and most organizations are all too familiar with this type of data quality issue. In fact, on average, organizations around the world believe that 27 percent of their current data is inaccurate, according to the Experian Data Quality benchmark report. But what if data is 100 percent accurate but five years old? Business users would often also consider such data poor quality because it isn’t timely, though it may still be useful in other ways. In Data Quality: The Accuracy Dimension, software expert and author Jack Olson wrote, “Data has quality if it satisfies the requirements of its intended use.” Take, for example, a business user who needs to examine online purchases by product type. The data set they have is missing half of the customers’ zip codes. Is it poor data? Not for their purposes. But if they needed to analyze consumer buying patterns by region, it certainly would be.
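Olson’s fitness-for-use idea can be made concrete with a simple completeness profile. The sketch below is illustrative only; the record layout and field names are assumptions, not taken from any particular product or the report cited above.

```python
# A minimal sketch of fitness-for-use profiling over purchase records.
# The field names ("product_type", "zip_code") are hypothetical.
purchases = [
    {"product_type": "electronics", "zip_code": "30301"},
    {"product_type": "apparel",     "zip_code": None},
    {"product_type": "electronics", "zip_code": None},
    {"product_type": "apparel",     "zip_code": "10001"},
]

def completeness(records, field):
    """Fraction of records with a non-missing value for `field`."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

# For an analysis by product type, zip completeness is irrelevant;
# for a regional analysis, 50 percent completeness makes the set unfit.
print(completeness(purchases, "product_type"))  # 1.0
print(completeness(purchases, "zip_code"))      # 0.5
```

The same data set scores perfectly for one use and poorly for another, which is exactly why “good” versus “bad” is a moving target.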
That same Experian benchmark report also stated that only 44 percent of executives trust their data enough to make important business decisions. But perhaps this lack of trust is a symptom of a lack of data understanding. Data quality is absolutely critical, but it is equally critical that business users understand data assets and how they can be used to maximize the quality of the insights they produce. Unfortunately, the report found that 52 percent of executives rely on gut feelings about their data when deciding whether to trust it. Through proper data quality and data governance, they can trade instinct for intellect, because they will know the lineage, age, and quality of each data asset before any business decisions are based on it.
The market for data quality tools has undergone rapid maturation as data has proliferated and the issues with big data quality have come into focus. Some organizations still rely on manual processes to improve data quality, a time- and resource-intensive ordeal that involves manual double- and triple-checks of data in the hope that nothing has been missed. With the volume of data exploding, such approaches are quickly becoming untenable for most organizations. Instead, organizations have a multitude of data quality tools at their disposal, typically deployed internally in organizations’ IT infrastructure but also available as hosted solutions. These tools can include parsing and standardization, data cleansing, profiling, monitoring, and data enrichment.
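Two of the capabilities named above, standardization and cleansing, can be sketched in a few lines. This is a simplified illustration under assumed inputs (US 10-digit phone numbers), not the behavior of any specific tool.

```python
import re

def standardize_phone(raw):
    """Normalize a US phone number to one canonical format.

    Returns None for values that cannot be standardized, so they can be
    routed to exception handling instead of being silently guessed at.
    """
    digits = re.sub(r"\D", "", raw)  # strip everything but digits
    if len(digits) == 10:
        return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
    return None

# Three different source formats collapse to one standard value;
# the incomplete number is flagged rather than "fixed".
raw_phones = ["404-555-0101", "(404) 555 0101", "4045550101", "555-0101"]
cleaned = {standardize_phone(p) for p in raw_phones}
```

Collapsing format variants like this is also what makes downstream deduplication and matching possible, since records that differ only in formatting become identical.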
As analytics gain traction and recognition for the transformative impact they can have on business decision-making, comparatively little attention has been given to their ability to improve data quality. This is in part because many organizations use older, retrofitted tools for data quality checks and monitoring, and those tools are not technologically advanced enough to govern data or embed analytics into data quality checks.
Layering analytics into the data quality process can revolutionize the way data quality checks are performed. By performing analytics and machine learning in concert with data quality business rules, organizations can substantially improve the efficiency and effectiveness of their data quality initiatives, greatly improving overall enterprise data quality.
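One simple way to see how analytics can work in concert with business rules: a static rule catches known-bad values, while a statistical check flags values that pass the rule but are anomalous against history. The sketch below uses a basic z-score test as a stand-in for the machine learning techniques the text alludes to; thresholds and field semantics are assumptions.

```python
import statistics

def rule_check(amount):
    """Static business rule: amounts must be present and non-negative."""
    return amount is not None and amount >= 0

def outlier_check(amount, history, threshold=3.0):
    """Statistical check: is the value within `threshold` standard
    deviations of the historical mean?"""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(amount - mean) <= threshold * stdev

history = [100, 105, 98, 102, 99, 101, 103, 97]

# A $5,000 transaction passes the static rule...
assert rule_check(5000)
# ...but the statistical check flags it as anomalous for investigation.
assert not outlier_check(5000, history)
```

In practice the anomaly detector would be trained on far richer features, but the division of labor is the same: rules encode what is known to be invalid, analytics surface what is merely suspicious.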
A common misconception organizations have about data quality is that it represents a set of challenges distinct from those addressed by data governance, and should therefore be handled with a discrete set of tools. On the contrary, as mentioned earlier, governance is fundamental to defining data quality, and data quality management can be an integral part of a comprehensive data governance strategy that maximizes both the value and utilization of organizational data assets.
Data governance is fundamentally about understanding data and defining its source, meaning, use, and ownership. These are all things that can impact how data quality is scored, so it makes sense that these would be closely connected. Data governance also involves the clarification of any potential quality issues, including missing information, questionable lineage, etc., through workflows and exception management for investigation and resolution with data owners, stewards and other stakeholders. In this way, governance is closely aligned with iterative data quality improvement.
To ensure data quality, organizations need a suite of data quality, governance, and analytics capabilities that can work in concert to enable better control over data and help solve data quality issues. The solution should deploy data quality validation including data profiling, completeness, consistency, timeliness, reconciliation/balancing, and value conformity. In addition, the solution should combine data quality checks with machine learning analytics and governance, so organizations can gain command over their data from both a business and IT perspective.
To learn more about an all-inclusive data quality, data governance and analytics solution, download the data sheet below.
For a deeper dive into this topic, visit our resource center. Here you will find a broad selection of content that represents the compiled wisdom, experience, and advice of our seasoned data experts and thought leaders.