What is Data Quality?

Jeffery Brown | April 15, 2020

White Paper: Trustworthy Data Depends on Enterprise Data Quality

According to Gartner, the leading technology analyst firm, the average large organization loses $15 million annually due to low-quality data.

Organizations often find themselves asking which is more important: data integrity or data quality? Poor data quality has plagued organizations for as long as they have collected data, but the primary reason it remains a challenge is that few businesses understand what data quality encompasses. Some organizations use the term data integrity to talk about the validity of data, and use the term data quality to describe the completeness, accuracy and timeliness of data. In practice, business users end up using the terms interchangeably, because to validate data they must first understand whether the information is complete, accurate and timely.

Data quality is more vital today than ever before. Why? Because of the distributed nature of today’s workforce and the growing complexity of business systems. With so many data consumers working remotely, ensuring data quality as critical information is passed around and consumed is of the utmost importance. Data quality also underpins the data analytics used to achieve business objectives, enhance operational efficiency and drive innovation. Even a slight error in the information, or a lack of trust in the data among business users, can derail analytical initiatives enterprise-wide, making data quality a top priority.

Therefore, companies need a single, precise definition of data quality before they build a strategy to mitigate risks that impact analytical goals.

Data Quality Defined 

Data quality is the practice of establishing trustworthy information that data users across the organization can rely on to achieve goals such as improved operations, increased business efficiency and compliance validation.

Data quality is about data trust. Can enterprise business users trust their data to make important business decisions? Can IT trust their information to prove regulatory compliance? If data users cannot trust their data, businesses do not have enterprise-wide data quality.

The Five Dimensions of Data Quality

There are five critical dimensions of data quality that users must understand and measure (a sketch of how each might be checked follows the list):

  • Completeness: Confirm there are no missing values, empty fields or omitted records in the data.
  • Conformance: Make sure each data field follows the expected pattern and that each value is the right type for its field, then cross-reference fields against a list of valid field types.
  • Consistency: Affirm the data makes sense in relationship to other data assets.
  • Accuracy: Ensure the data correctly describes the real-world objects or events it refers to.
  • Timeliness: Verify the data is up-to-date or relevant for its specific purpose.
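
To make these dimensions concrete, below is a minimal Python sketch of how each one might be expressed as an automated check. The record, field names, patterns and thresholds are illustrative assumptions, not part of any particular product or standard.

```python
import re
from datetime import datetime, timedelta, timezone

# Hypothetical record from a customer orders feed (all names and values are
# illustrative assumptions, not taken from any real system).
record = {
    "order_id": "ORD-10042",
    "email": "jane.doe@example.com",
    "country": "US",
    "order_total": 125.50,
    "line_item_total": 125.50,
    "order_date": "2020-04-10T14:32:00+00:00",
}

REQUIRED_FIELDS = ["order_id", "email", "country", "order_total", "order_date"]
VALID_COUNTRIES = {"US", "CA", "GB", "DE"}      # assumed reference list
ORDER_ID_PATTERN = re.compile(r"^ORD-\d{5}$")   # assumed identifier format


def check_completeness(rec):
    """Completeness: no required field is missing or empty."""
    missing = [f for f in REQUIRED_FIELDS if rec.get(f) in (None, "")]
    return [f"missing or empty field: {f}" for f in missing]


def check_conformance(rec):
    """Conformance: values follow the expected pattern and value type."""
    issues = []
    if not ORDER_ID_PATTERN.match(str(rec.get("order_id", ""))):
        issues.append("order_id does not match the expected pattern")
    if not isinstance(rec.get("order_total"), (int, float)):
        issues.append("order_total is not numeric")
    if rec.get("country") not in VALID_COUNTRIES:
        issues.append("country is not in the valid reference list")
    return issues


def check_consistency(rec):
    """Consistency: the record agrees with related data (here, its line items)."""
    if rec.get("order_total") != rec.get("line_item_total"):
        return ["order_total does not equal the sum of line items"]
    return []


def check_accuracy(rec):
    """Accuracy: a simple plausibility rule standing in for a source-of-truth check."""
    if rec.get("order_total", 0) < 0:
        return ["order_total is negative, which cannot describe a real order"]
    return []


def check_timeliness(rec, max_age_days=30):
    """Timeliness: the record is recent enough for its intended purpose."""
    order_date = datetime.fromisoformat(rec["order_date"])
    if datetime.now(timezone.utc) - order_date > timedelta(days=max_age_days):
        return ["order_date is older than the freshness threshold"]
    return []


if __name__ == "__main__":
    checks = [check_completeness, check_conformance, check_consistency,
              check_accuracy, check_timeliness]
    for check in checks:
        problems = check(record)
        print(f"{check.__name__}: {'OK' if not problems else '; '.join(problems)}")
```

In practice these rules would run against entire datasets rather than a single record, but the same dimension-by-dimension structure applies.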

Challenges of Data Quality 

Once organizations foster a complete understanding of data quality among all users, they can start to analyze data quality gaps and challenges and determine how to solve each issue.

For example, incomplete or inconsistent data at the source increases risk when moved to target systems. Organizations must place greater emphasis on data integrity at the source.

Organizations ingest data from third-party sources, introducing additional risk into their data supply chain. Businesses must apply data quality checks to all new data that enters the data supply chain. Instituting regular checks for accuracy and completeness allows reconciliation between systems and verifies the quality of all external data. It also helps establish more beneficial service level agreements by holding data providers accountable.
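
A reconciliation check of this kind can be as simple as comparing record counts, control totals and row-level fingerprints between a source extract and the target system. The Python sketch below is a hypothetical, simplified illustration; the sample rows and column names are assumptions.

```python
import hashlib

# Hypothetical source and target extracts (illustrative only).
source_rows = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": 250.0},
    {"id": 3, "amount": 75.5},
]
target_rows = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": 250.0},
]


def row_fingerprint(rows):
    """Order-independent hash of the rows, used to detect silent value changes."""
    digests = sorted(
        hashlib.sha256(repr(sorted(r.items())).encode()).hexdigest() for r in rows
    )
    return hashlib.sha256("".join(digests).encode()).hexdigest()


def reconcile(source, target):
    """Compare record counts, control totals and fingerprints between systems."""
    findings = []
    if len(source) != len(target):
        findings.append(f"record count mismatch: source={len(source)} target={len(target)}")
    source_total = sum(r["amount"] for r in source)
    target_total = sum(r["amount"] for r in target)
    if source_total != target_total:
        findings.append(f"control total mismatch: source={source_total} target={target_total}")
    if row_fingerprint(source) != row_fingerprint(target):
        findings.append("row-level fingerprints differ")
    return findings


if __name__ == "__main__":
    for finding in reconcile(source_rows, target_rows) or ["source and target reconcile"]:
        print(finding)
```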

Many companies have complex IT infrastructures and even more complicated data transfer processes. As the number of platforms and applications grows and data volumes increase, maintaining high-quality data through those transfers becomes increasingly difficult. Organizations must consolidate systems where appropriate and apply the proper rules to prevent errors related to data structure logic and loading processes.

Improper formatting, blank fields and transformation errors can prevent data from loading properly into target systems. In addition, data is constantly being updated. An enterprise data quality program with automated data quality rules enables data transfers to happen seamlessly and quickly flags incorrectly input data, preventing those errors from impacting regulatory compliance reports.
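
For example, a lightweight set of automated rules might validate each inbound row and quarantine anything with improper formatting or blank fields before it reaches the target system. The sketch below is a hypothetical illustration; the file layout and rules are assumed for demonstration, not taken from any specific data quality product.

```python
import csv
import io

# Hypothetical inbound file with formatting problems (illustrative only).
INBOUND_CSV = """account_id,balance,report_date
A-001,1050.25,2020-03-31
A-002,,2020-03-31
A-003,not_a_number,2020-03-31
"""


def validate_row(row):
    """Return a list of rule violations for a single inbound row."""
    errors = []
    if not row["account_id"].strip():
        errors.append("blank account_id")
    balance = row["balance"].strip()
    if not balance:
        errors.append("blank balance")
    else:
        try:
            float(balance)
        except ValueError:
            errors.append("balance is not numeric")
    return errors


def split_load(rows):
    """Route clean rows to the target load and problem rows to a quarantine queue."""
    clean, quarantined = [], []
    for row in rows:
        errors = validate_row(row)
        (quarantined if errors else clean).append((row, errors))
    return clean, quarantined


if __name__ == "__main__":
    rows = list(csv.DictReader(io.StringIO(INBOUND_CSV)))
    clean, quarantined = split_load(rows)
    print(f"{len(clean)} rows ready to load, {len(quarantined)} quarantined")
    for row, errors in quarantined:
        print(f"quarantined {row['account_id'] or '<missing id>'}: {', '.join(errors)}")
```

Quarantining problem rows rather than silently dropping or loading them keeps the target system clean while preserving an audit trail of what failed and why.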

Turning data into meaningful, actionable business intelligence means businesses must take a step back and recognize everything data quality encompasses, including its meaning and its many challenges.

When organizations take the time to understand data quality and its challenges, and collaborate across the enterprise to create a tailored data integrity plan, they can deliver quality data that meets business objectives.

For more information about establishing data quality, read this article in Forbes from our CEO, Early Stephens.

Are you looking for additional information about data quality? Check out the white paper above or below for a more detailed look.

Get Insights

For a deeper dive into this topic, visit our resource center. Here you will find a broad selection of content that represents the compiled wisdom, experience, and advice of our seasoned data experts and thought leaders.

White Paper: Trustworthy Data Depends on Enterprise Data Quality