In January 2016, Solvency II, the risk-based European supervisory framework for insurance, became applicable. The regulatory framework was created to help insurers develop and effectively operationalise governance and risk management practices across three pillars.
To achieve Solvency II compliance, many insurers have developed additional layers of End User Computing (EUC), building User Defined Applications (UDAs) on tried-and-trusted technologies such as Access and Excel. While this approach provides an initial tick in the box for compliance, it offers little in terms of ongoing sustainability, transparency or business value for the effort invested. Excel and Access may get the job done for day-one compliance, but they add risk and depend on manual processes that make ongoing maintenance costly and prone to error, neither of which is welcome in a compliance process.
These challenges are further compounded by the disconnect between the data quality controls implemented to ensure the accuracy, completeness and appropriateness of internal model data, and the governance of those data attributes through the implemented data directory.
These were among the challenges identified by the Prudential Regulation Authority (PRA) in the UK within its data review published in February 2016[i] with regard to firms' Internal Model Approval Process (IMAP). The review had ten key findings, the majority of which focused on core data governance principles of ownership and communication, as well as data quality controls.
From a data governance perspective, the Solvency II framework is intended to ensure that data quality standards are developed that apply to all data used to operate, validate and develop the internal model, including external data. To achieve this, the insurer should compile a data directory, specifying source, characteristics, and usage of the data.
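As an illustration, a data directory entry of the kind described above could capture the source, characteristics and usage of each data item alongside its owner. This is a minimal sketch only; the field names and example values are assumptions, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class DataDirectoryEntry:
    """One record in an internal model data directory (illustrative fields only)."""
    item_name: str          # name of the data item, e.g. a premium or claims field
    source_system: str      # originating system or external data provider
    characteristics: str    # type, granularity, currency/units
    usage: str              # where the item feeds the internal model
    owner: str              # accountable data owner
    is_external: bool = False  # flags externally sourced data, also in scope

# Hypothetical example entry
entry = DataDirectoryEntry(
    item_name="gross_written_premium",
    source_system="policy_admin_db",
    characteristics="monetary, policy-level, GBP",
    usage="premium risk calibration",
    owner="Head of Underwriting Data",
)
```

Holding entries in a structured form like this, rather than in free-text spreadsheets, makes it easier to query which items feed which models and who owns them.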
Having identified the data used in the internal model, the insurer needs to undertake data quality checks to ensure the data is accurate and complete.
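A minimal sketch of such checks might measure completeness (no missing values) and accuracy (values within plausible bounds). The field names, sample records and thresholds below are assumptions for illustration only:

```python
def check_completeness(records, field_name):
    """Share of records where field_name is present and non-null."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field_name) is not None)
    return filled / len(records)

def check_accuracy(records, field_name, lower, upper):
    """Share of records whose field_name falls within plausible bounds."""
    if not records:
        return 0.0
    in_range = sum(
        1 for r in records
        if r.get(field_name) is not None and lower <= r[field_name] <= upper
    )
    return in_range / len(records)

# Hypothetical claims extract with deliberate quality issues
claims = [
    {"claim_amount": 1200.0},
    {"claim_amount": None},    # missing value
    {"claim_amount": 350.0},
    {"claim_amount": -50.0},   # implausible negative amount
]

completeness = check_completeness(claims, "claim_amount")          # 0.75
accuracy = check_accuracy(claims, "claim_amount", 0.0, 1_000_000)  # 0.5
```

In practice, thresholds for acceptable completeness and accuracy would be set in the data policy, with failures routed back to the accountable data owner named in the data directory.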
While the minimum requirement for the data directory and associated data policy is to cover the internal model, insurers should consider extending it to cover the wider data flows that feed the internal model, including source systems and databases.
The data directory should assist in creating an understanding of the uses and materiality of each data item for the final modelling processes and results, so that activities such as data cleansing are focused where material benefits can be obtained. Within this process, data lineage is important: it explains the journey each data item takes from its original entry through to its use within the internal model.
While the above outlines the aspiration and intent of the framework, the reality is that a disconnect often exists between governance and control. So how can insurers deliver real value from the investment made to date, and ensure future investments aren't relegated to another tick-in-the-box compliance exercise?
By following the steps outlined above (compiling a data directory, applying data quality checks, extending coverage beyond the internal model, and tracing data lineage), an organisation can truly transform the effectiveness of data governance over the internal model and the wider business processes, driving real value and moving away from just another tick-in-the-box mandated activity.
To learn more about how to liberate your data with data governance, download the white paper below.
For a deeper dive into this topic, visit our resource center. Here you will find a broad selection of content that represents the compiled wisdom, experience, and advice of our seasoned data experts and thought leaders.