PricewaterhouseCoopers (PwC)

Countdown to Solvency II:
Data quality: Don’t leave it until last!

With so much of the Solvency II implementation project resting on the quality of the data, now is the time to make data a priority and get preparations on track.

Quality data is the lifeblood of successful Solvency II implementation and ensures the effective integration of risk considerations into the decision-making that underpins it.

In its proposed ‘Level 2’ implementation measures, the Committee of European Insurance and Occupational Pensions Supervisors (CEIOPS) said: “Given the high importance of data quality standards, CEIOPS advocates gearing interpretation towards a comprehensive scope of data quality requirements.”[1]

To date, the Level 2 recommendations (CP 43 and CP 56) have focused on the data used in the calculation of technical provisions and operation of internal models. Further proposals are likely to broaden the scope by incorporating all the key data used in the running of the business into the Own Risk and Solvency Assessment (ORSA). Data quality management is therefore set to be an ongoing and highly scrutinised process.

1. CEIOPS CP 56 – ‘Tests and standards for internal model approval’, issued in July 2009

Data quality management should include a documented record of what is meant by ‘quality’, how it is monitored, and what processes are in place to validate and update the information and, where necessary, to address deficiencies. Ideally, these procedures should be supported by metrics and agreed tolerances for tracking and upholding quality. The other key foundation is a data dictionary setting out the source and characteristics of the information: who owns it, how it is deployed and how its use may need to be qualified to take account of any uncertainty or margin of error. As Figure 1 outlines, these quality control procedures can be summarised as ‘define, assess, improve and sustain’.
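By way of illustration only, the sketch below shows how a data dictionary entry and its agreed tolerances might be captured and checked, here in Python. It is a minimal, hypothetical example: every field name, owner and threshold shown is an assumption for the purposes of the illustration, not a prescribed format.

from dataclasses import dataclass

@dataclass
class DataDictionaryEntry:
    """Hypothetical data dictionary record: one entry per data item."""
    name: str                 # e.g. "gross_written_premium"
    source: str               # originating system or external provider
    owner: str                # accountable business owner
    used_in: list             # processes that consume the item
    quality_tolerances: dict  # agreed threshold per quality dimension
    caveats: str = ""         # known uncertainty or margin of error

def assess_quality(entry: DataDictionaryEntry, measured: dict) -> dict:
    """Compare measured quality metrics against the agreed tolerances.

    Returns a pass/fail flag per dimension so that deficiencies can be
    logged and remediated (the 'assess' and 'improve' steps of the cycle).
    """
    return {
        dimension: measured.get(dimension, 0.0) >= threshold
        for dimension, threshold in entry.quality_tolerances.items()
    }

# Illustrative usage; all names and figures are assumptions.
entry = DataDictionaryEntry(
    name="gross_written_premium",
    source="policy_admin_system",
    owner="Head of Underwriting",
    used_in=["technical_provisions", "internal_model", "ORSA"],
    quality_tolerances={"completeness": 0.99, "accuracy": 0.98},
    caveats="Late-booked premiums estimated at month end.",
)
print(assess_quality(entry, {"completeness": 0.995, "accuracy": 0.97}))
# {'completeness': True, 'accuracy': False} -> accuracy breach to address

In practice such entries would sit in a governed repository alongside the data quality policy, but even a simple structure like this makes ownership, usage and tolerances explicit.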

Potential gaps

Many companies have some form of data quality policy and associated monitoring procedures in place. However, the formal mapping of sources, ownership and other key aspects of data governance may be lacking. Among the suppliers and users of information there is also varying understanding of what is meant by data quality, the tools in place to measure it and what should be done when deficiencies are detected.

Actuarial, transactional and other internal data are generally regarded as sufficiently accurate, complete and appropriate to be fit for purpose. Many companies are looking to enhance consistency and improve MI turnaround by developing centralised data warehouses. What is often missing is a systematic process for managing external data in areas such as market and operational risk, which CEIOPS insists should be held to the same standards as internal information.[2]
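To make the point concrete, the sketch below applies the same kind of completeness and timeliness checks to an external feed that would normally be applied to internal data. The file name, field names and five-day staleness threshold are purely illustrative assumptions, not CEIOPS requirements.

import csv
from datetime import date, datetime

def check_external_feed(path: str, required_fields: list, max_age_days: int = 5) -> dict:
    """Apply the same completeness and timeliness checks to an external
    data feed as would be applied to internal data."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Completeness: share of rows with every required field populated.
    complete = [
        r for r in rows
        if all(r.get(k) not in (None, "") for k in required_fields)
    ]

    # Timeliness: how recent is the latest 'as_of_date' in the feed?
    dates = [
        datetime.strptime(r["as_of_date"], "%Y-%m-%d").date()
        for r in rows if r.get("as_of_date")
    ]
    latest = max(dates, default=None)

    return {
        "rows": len(rows),
        "completeness": len(complete) / len(rows) if rows else 0.0,
        "stale": latest is None or (date.today() - latest).days > max_age_days,
    }

# Illustrative usage (file name, fields and threshold are assumptions):
# check_external_feed("market_risk_feed.csv", ["instrument_id", "price", "as_of_date"])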

2. Data quality requirements should apply to ‘all data, irrespective of the source’ – CEIOPS CP 56, issued in July 2009

In the autumn of 2009, we carried out an informal survey of more than 40 insurance professionals to gauge what they see as the toughest challenges their firms face in developing an effective data quality framework (see Figure 2). While mapping information flows was recognised as difficult and resource-intensive, it is notable that managing external data was seen as less onerous. It is also perhaps surprising that data assessment and data quality improvement as a whole were regarded as only moderately difficult.

The bigger picture

Many companies have so far tended to focus on the data needed for capital evaluation, or on the IT developments that may be required to support it. We would advocate a broader, more strategic approach that considers the data needed to comply with Solvency II as a whole and how these developments could be harnessed to enhance MI within the business (see Figure 3). Early engagement with boards and business teams will help to discern how the usefulness of MI could be improved and to allocate responsibilities within implementation work streams.

All-round solution

With 2012 just around the corner, now is the time to bring data quality to the top of the agenda. This includes looking at what the business needs and how it can be delivered in time. Conversely, treating data quality primarily as a technological challenge, to be resolved by investment in data warehouses and other systems solutions in isolation, may fail to give business teams access to the information they require. In turn, a purely tactical response to compliance is unlikely to provide the necessary usability and adaptability and may ultimately prove more expensive than a strategic solution.

Giving you the edge

PricewaterhouseCoopers is helping a range of insurers to get to grips with the practicalities of Solvency II implementation. If you would like to know more about how to develop and embed an effective data quality framework, please contact:

David Hush Director david.hush@uk.pwc.com

Paul Smith Director smith.paul@uk.pwc.com

Chris Gullick Senior Manager chris.a.gullick@uk.pwc.com

© 2009 PricewaterhouseCoopers. All rights reserved.
PricewaterhouseCoopers refers to the network of member firms of PricewaterhouseCoopers International Limited, each of which is a separate and independent legal entity.