Don’t Overlook the 4 Key Data Governance Approaches to Help Achieve Solvency II

Learn how data governance can help achieve Solvency II

Stephen Linsley | April 20, 2017


In January 2016, Solvency II, the risk-based European supervisory framework for insurance, became applicable. The regulatory framework was created to help insurers develop and effectively operationalise governance and risk management practices across three pillars.

To achieve Solvency II compliance, many insurers have added layers of End User Controls (EUC) by developing User Defined Applications (UDAs) on tried and trusted technologies such as Access and Excel. While this approach provides an initial tick-in-the-box for compliance, it offers little in terms of ongoing sustainability, transparency or business value for the effort invested. Excel and Access get the job done for day-one compliance, but they add risk and rely on manual processes that make ongoing maintenance costly and error-prone, neither of which is welcome in a compliance process.

These challenges are further compounded by the disconnect that exists between data quality controls implemented to ensure the accuracy, completeness and appropriateness of internal model data and the governance of those data attributes through the implemented data directory.

Top 3 Challenges

These were among the challenges identified by the Prudential Regulation Authority (PRA) in the UK in its data review published in February 2016[i] with regard to firms' Internal Model Approval Process (IMAP). The review had ten key findings, the majority of which focused on the core data governance principles of ownership and communication, as well as data quality controls.

  1. Data Ownership – Ownership and mapping of data lineage remain challenging, impairing the user community's ability to understand changes.
  2. Data Directory – Although the data directory was intended as a knowledge base through which users could gain insight into the data used in the internal model (e.g. its source, use and characteristics), the complexity of the process established around it made it unsustainable and therefore effectively unfit for BAU operations.
  3. Data Quality Controls – Demonstrating a holistic, end-to-end controls framework from source to internal model remains problematic.

From a data governance perspective, the Solvency II framework is intended to ensure that data quality standards are developed that apply to all data used to operate, validate and develop the internal model, including external data. To achieve this, the insurer should compile a data directory, specifying source, characteristics, and usage of the data.

Having identified the data used in the internal model, the insurer needs to undertake data quality checks to ensure the data is accurate and complete.

  1. Accurate – Relates to the degree of confidence that can be placed in the data. Data must be sufficiently accurate to avoid material distortion of the model output.
  2. Complete – Databases provide comprehensive information for their purpose and use.
  3. Appropriate – Data does not contain biases that would make it unfit for purpose.
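As a concrete illustration, the three quality dimensions can be expressed as automated checks. The sketch below is a minimal, hypothetical Python example: the record layout, field names (`policy_id`, `premium`, `source`) and thresholds are assumptions for illustration, not part of the framework.

```python
# Illustrative sketch only: minimal data-quality checks over policy records,
# using hypothetical field names and thresholds to show how the three
# Solvency II quality dimensions might be expressed as automated checks.

REQUIRED_FIELDS = {"policy_id", "premium", "source"}

def check_complete(records):
    """Complete: return records missing a required field or holding an empty value."""
    return [r for r in records
            if not REQUIRED_FIELDS <= {k for k, v in r.items() if v not in (None, "")}]

def check_accurate(records, min_premium=0.0, max_premium=1e7):
    """Accurate: return records whose premium falls outside a plausible range
    (the range itself is an assumption for this sketch)."""
    return [r for r in records
            if not (min_premium < float(r.get("premium", -1)) <= max_premium)]

def check_appropriate(records, max_source_share=0.9):
    """Appropriate: flag any single source dominating the sample (a crude bias proxy)."""
    counts = {}
    for r in records:
        counts[r.get("source")] = counts.get(r.get("source"), 0) + 1
    total = len(records) or 1
    return [s for s, n in counts.items() if n / total > max_source_share]

records = [
    {"policy_id": "P1", "premium": 1200.0, "source": "broker_feed"},
    {"policy_id": "P2", "premium": -50.0, "source": "broker_feed"},  # fails accuracy
    {"policy_id": "P3", "premium": 800.0, "source": ""},             # fails completeness
]
print(check_complete(records), check_accurate(records), check_appropriate(records))
```

In practice these checks would run against source systems and staging databases rather than in-memory lists, with the pass/fail criteria agreed per data source.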

While the minimum requirement for the data directory and associated data policy is to cover the internal model, insurers should consider extending them to cover the wider data flows that feed the internal model, including source systems and databases.

The data directory should help create an understanding of the uses, and materiality, of each data item to the final modelling processes and results. It should also ensure that activities such as data cleansing are focused where material benefits can be obtained. Within this process, data lineage is important to explain the journey the data takes, from its original point of entry through to its use within the internal model.
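To make the idea of lineage concrete, the sketch below models lineage as a simple mapping from each data item to its upstream sources and walks it back to the point of original entry. The item names and the flat-dictionary structure are hypothetical, purely for illustration.

```python
# Illustrative sketch only: lineage as a mapping from each data item to its
# upstream sources, so an item used in the internal model can be traced back
# to its point of original entry. All names here are hypothetical.

LINEAGE = {
    "model_input.claims_triangle": ["warehouse.claims_agg"],
    "warehouse.claims_agg": ["staging.claims_raw"],
    "staging.claims_raw": ["source.claims_admin_system"],
}

def trace_to_origin(item, lineage):
    """Walk upstream until reaching items with no recorded parents
    (the original points of entry)."""
    parents = lineage.get(item, [])
    if not parents:
        return [item]
    origins = []
    for p in parents:
        origins.extend(trace_to_origin(p, lineage))
    return origins

print(trace_to_origin("model_input.claims_triangle", LINEAGE))
# ['source.claims_admin_system']
```

The same traversal, run in reverse, answers the impact question: which downstream model inputs are affected if a given source attribute changes.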

The Disconnect between Governance and Controls

While the above outlines the aspiration and intent of the framework, the reality is that a disconnect exists between governance and control. So how can insurers deliver real value for the investment made to date, and ensure future investments aren’t relegated to another tick-in-the-box compliance exercise?

  1. Data Directory Sustainability – Move the data directory from a UDA-type process to a strategic platform that can scale and ease the maintenance burden. Socialise the data critical to the internal model, along with the controls that exist around it and the ongoing assessment of its quality, to a wider audience. With the right tool it should be possible to achieve this as a simple data migration exercise from the existing UDA.
  2. Data Ownership – Once the data directory is stored in a more strategic platform, with the business glossary established and the data appropriately classified, it becomes much easier to establish data ownership. Responsibility moves from a single resource, or pool of resources, maintaining the data directory to truly engaged business/process owners acting as stewards of their data.
  3. Data Lineage – The data directory must provide the ability to trace data and resources back to their originating sources. This is, however, cumbersome, if not impossible, to maintain effectively within a UDA. It is therefore imperative that the chosen tool can graphically visualise the lineage and automatically map the underlying technical metadata. This enables the lineage to address the primary business concern, which is to quickly assess the potential impact of proposed data changes and issues within an organisation.
  4. Data Controls
  • Ensure it is clearly understood what is meant by the terms “accurate,” “complete,” and “appropriate” in the context of your organisation.
  • Ensure quantitative and/or qualitative criteria are established for all data sources associated with your organisation’s approach to the data quality dimensions. The attainment should be a prerequisite for its use within the internal model.
  • Automate the established controls, ensuring they are appropriately embedded within the critical processes and that visibility into the results is available at the operational level, so that quality improvement initiatives can be undertaken effectively, in terms of both immediate remediation and ongoing improvement.
  • Align controls with the data directory to provide automated feeds that update quality metrics, thereby giving the business transparency into how quality is being assured for the internal model attributes, and into their “fitness” for use.
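The last two points, automating a control and feeding its result back to the directory, can be sketched in code. The following hypothetical Python example shows a completeness control updating the quality metrics on a data directory entry; the entry structure, the control and the 95% threshold are assumptions for illustration only, not a reference to any specific tool.

```python
# Illustrative sketch only: an automated completeness control whose result is
# fed back into a data directory entry, keeping quality metrics aligned with
# the directory. The structures and threshold are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DirectoryEntry:
    """One attribute in the data directory, carrying its latest quality metrics."""
    name: str
    source: str
    quality_metrics: dict = field(default_factory=dict)

def run_control(values, threshold=0.95):
    """Completeness control: pass if the share of populated values meets the threshold."""
    populated = sum(1 for v in values if v not in (None, ""))
    score = populated / len(values) if values else 0.0
    return {"completeness": round(score, 3), "passed": score >= threshold}

def update_directory(entry, result):
    """The automated feed: the control result updates the directory entry's metrics."""
    entry.quality_metrics.update(result)
    return entry

entry = DirectoryEntry(name="sum_insured", source="policy_admin_db")
result = run_control([100, 250, None, 300], threshold=0.95)
update_directory(entry, result)
print(entry.quality_metrics)  # {'completeness': 0.75, 'passed': False}
```

Scheduling such controls within the critical processes, rather than running them ad hoc, is what gives the operational level the visibility needed for both immediate remediation and ongoing improvement.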

By following the above steps, an organisation can truly transform the effectiveness of data governance over the internal model and the wider business processes to drive real value and move away from just another tick-in-the-box mandated activity.

To learn more about how to liberate your data with data governance, download the white paper below.


Get Insights

For a deeper dive into this topic, visit our resource center. Here you will find a broad selection of content that represents the compiled wisdom, experience, and advice of our seasoned data experts and thought leaders.

Download White Paper