The value of a data governance program can be simplified into two factors: the quality of the decisions it produces, and how effectively those decisions are executed.
We’ll save the first point for another day, but even perfect decisions become ineffective if they’re poorly executed. Many governance decisions lead to correcting flaws in the data. This step is broadly referred to as “data quality,” but all too often it’s disconnected from the governance decision-making process. Business teams huddle together to make a call about how data needs to be treated, and then hand the diagnosis to someone else to implement the prescription. This segregation results in a lack of transparency between teams, unnecessary waste, and poor visibility into the effectiveness of governance decisions. In this blog post, we look at three actionable steps you can take to get more value from your data governance program by streamlining your approach to data quality.
If data governance is the recipe for how to protect and extract value from your data, then data quality execution is making sure all of the critical data ingredients are up to standard and ready for use. Imagine you were a chef asked to prepare a meal for some highly particular guests, but you had no idea what your guests wanted to eat, how much they wanted, when they wanted it, or their specific dietary restrictions. You would be pretty nervous serving a meal without understanding the requirements your guests expect you to follow. This is the life of a data quality team that is asked to implement data governance decisions without understanding why a decision was made, what triggered it, or the context behind the business rules, policies, and processes involved.
Far too often, business requirements are tossed over to data quality teams who must translate them into implementation rule logic. Integrating data quality execution into the data governance decision-making process, ideally within the same platform, gives data quality teams a much greater contextual understanding of the data decisions being made. Providing a link to how data impacts the business allows a data quality analyst to understand how an ingredient (data) needs to be treated (cleansed, monitored, etc.) to best serve the requested meal (business outcome).
As the saying goes, complexity is the enemy of execution. Every barrier placed between the moment a decision is made and the moment it is implemented can dramatically slow progress. This is where fast-moving teams lean on three principles to do more with less.
First, keep your implementation workflows as simple as possible. This means removing steps until the process breaks, then adding the last one back in. If you’ve followed the first point above, your governance and data quality workflows flow seamlessly across your integrated platform.
Second, leverage advanced automation to reduce the heavy lifting of data quality monitoring and analysis. We’re in an age where intelligent data quality solutions can work while you sleep. For example, machine learning can score the likelihood of inaccuracies based on historical data characteristics and past issue reconciliations.
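The post doesn’t prescribe a specific technique, but the core idea can be sketched with a simple statistical baseline: score how far today’s value of a quality metric deviates from its historical pattern, and flag high scores for review. Here is a minimal, hypothetical Python sketch (the metric, column, and numbers are illustrative only, not from any particular product):

```python
from statistics import mean, stdev

def inaccuracy_score(history, today):
    """Score how anomalous today's metric is relative to its history.

    Returns a z-score: the number of standard deviations today's value
    sits from the historical mean. Higher scores suggest a likelier
    data quality issue worth a human look.
    """
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return 0.0 if today == mu else float("inf")
    return abs(today - mu) / sigma

# Thirty days of daily null rates for a hypothetical customer-email
# column, followed by two candidate observations for "today".
history = [0.02, 0.03, 0.02, 0.025, 0.03, 0.02, 0.028] * 4 + [0.02, 0.03]

print(inaccuracy_score(history, 0.02))  # in line with history: low score
print(inaccuracy_score(history, 0.15))  # sudden spike: high score, flag it
```

In practice a governance platform would run scoring like this on many columns and metrics overnight, surfacing only the outliers for analysts each morning.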
Third, leverage pre-existing content before resorting to creating something new. Prepackaged rule libraries, frameworks, workflows, and accelerators act as virtual booster rockets that help you move swiftly from decision to action. Combined, these principles allow teams to do 10x the work in a fraction of the time.
Building trust comes down to aligning expectations and demonstrating accountability for results. Measurement scorecards that let data, business, and IT stakeholders set goals together and review progress toward them will do wonders for your governance program. Tactically speaking, dashboards that connect data, data trust, and the business impacts stakeholders expect can have huge effects.
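To make the scorecard idea concrete, here is a minimal Python sketch of one way such a rollup might work: per-dimension quality scores are weighted by how much each dimension matters to the business outcome and combined into a single trust score. The dimensions, scores, and weights below are hypothetical, not taken from any real scorecard:

```python
def trust_score(dimension_scores, weights):
    """Roll per-dimension quality scores (0-100) into one weighted
    trust score that stakeholders can track against a shared goal."""
    total_weight = sum(weights.values())
    return sum(dimension_scores[d] * w for d, w in weights.items()) / total_weight

# Hypothetical scores for a customer dataset feeding a sales report.
# Uniqueness is weighted heavily because duplicate customers distort
# growth metrics; accuracy matters for the reported figures themselves.
scores  = {"completeness": 98, "accuracy": 91, "timeliness": 85, "uniqueness": 72}
weights = {"completeness": 1, "accuracy": 3, "timeliness": 1, "uniqueness": 3}

print(round(trust_score(scores, weights), 1))  # → 84.0
```

Publishing a number like this alongside the KPIs it underpins lets stakeholders see at a glance how much the reported figures can be trusted, and watch the score move as quality issues are resolved.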
For example, a data governance program at a Life Sciences organization struggled to get funding to resolve issues they knew were creating inaccurate sales reports. At the time, company leadership was considering launching a new product line based on patterns they were seeing in new customer growth metrics. The governance and quality teams became aware of the initiative and built a quick prototype that exposed that a significant portion of duplicate customer records were being misidentified as net-new customers. Bringing a potentially disastrous business decision based on poor data quality to light gave the team the leverage to secure funding for additional tools and resources. From there, they built a “trust scorecard” into the executive KPI report that embedded the underlying data, definitions, and data quality scores from the governance platform.
Success with data requires a well-designed and sustainable data governance program that features an integrated, comprehensive data quality approach. Streamlining the implementation of governance decisions means organizations can execute in a lean, agile fashion while staying aligned on the outcomes and results the business expects.
Are you looking for additional information about data quality-powered data governance? Check out our white paper: https://www.infogix.com/resources/trustworthy-data-depends-on-enterprise-data-quality/.
For a deeper dive into this topic, visit our resource center. Here you will find a broad selection of content that represents the compiled wisdom, experience, and advice of our seasoned data experts and thought leaders.