
'Solvency II vital in business transformation'

CIOL Bureau

Solvency II compliance is one of the topmost items on the agenda of European insurance companies. Most insurers are a long way into their Solvency II implementations, aiming to go live before November 2012, when the regulation comes into force. However, many companies have still not made a start.


The directive requires insurers to demonstrate that they hold sufficient capital, are able to manage their risks, and can protect policyholders even in adverse conditions.

For many insurance companies, Solvency II necessitates a significant business and technical transformation of their risk management systems. Technology initiatives will primarily include a more rigorous risk governance framework, refinement of the internal models, and more robust and transparent reporting systems. The typical Solvency II architecture can be represented as shown below:

[Figure: Typical Solvency II architecture]

For large insurers, IT will play a central role in ensuring that the requirements of all three pillars of Solvency II are met comprehensively. As part of our work supporting companies with their Solvency II strategy, we have uncovered numerous information management challenges.


Data Standardisation

Many organisations have different ways of referring to the same set of data across various teams, and this causes problems in understanding, reuse and maintenance of data across the firm. The lack of a single, consistent metadata and reference data repository makes it difficult for capital adequacy calculations to be represented and reported meaningfully and comprehensively to internal stakeholders as well as external regulators.
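A minimal sketch of the remedy, with purely hypothetical entity names and synonyms, is a single reference-data mapping that resolves each team's labels to one canonical term before the data feeds the capital calculations:

# Hypothetical sketch: resolve team-specific labels to one canonical
# reference-data term before the data feeds capital adequacy calculations.
CANONICAL_TERMS = {
    "counterparty": {"cpty", "counter_party", "obligor"},
    "line_of_business": {"lob", "business_line", "product_class"},
    "solvency_capital_requirement": {"scr", "required_capital", "cap_req"},
}

# Reverse lookup: any known synonym -> canonical name.
SYNONYM_TO_CANONICAL = {
    synonym: canonical
    for canonical, synonyms in CANONICAL_TERMS.items()
    for synonym in synonyms
}

def standardise(record: dict) -> dict:
    """Rename incoming fields to canonical terms; unknown fields pass through."""
    return {SYNONYM_TO_CANONICAL.get(key, key): value for key, value in record.items()}

# Example: a feed from the underwriting team using its own labels.
print(standardise({"cpty": "ACME Re", "lob": "motor", "cap_req": 1250000}))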



Reliability of the Internal Models

Internal models are the beating heart of Solvency II implementations. In the absence of robust internal models, companies will have to fall back on the standard formula to calculate their capital requirements, which may be significantly higher than the figures produced by internal models. Actuaries are responsible for developing models that allow them to underwrite the most risk with the capital they have.

But due to the sophistication and specialism involved, very little oversight of the modelling activity is possible. This has traditionally been a big problem; insurers develop riskier and less reliable models in order to expand their business, and some of these have been exposed during unfavourable economic times.


Regulators expect more transparency and traceability into the inner workings of the actuarial team. This demands a greater degree of documentation, stricter data governance, clearer annotation around the use of various assumption sets, and visible controls.

The models are typically based on permutations of scenarios and hence generate huge data volumes. All of this data is generally dumped into the data warehouse and ends up being pushed into the reporting layer. Greater thought is needed about what constitutes a reliable realistic run and what constitutes a simulation run.

Actuaries can help rationalise their scenarios, making it easier for them and the IT team to manage and explain their data and its treatment. Reduced data volumes also help resolve any data quality issues quickly and support better integration of the risk IT landscape.
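One hedged way of putting that thought into practice, sketched below with invented field names, is to tag every model run at source so that only reporting-grade realistic runs flow into the warehouse and reporting layer, while bulk simulation runs stay in cheaper working storage:

from dataclasses import dataclass

# Hypothetical run metadata; the realistic/simulation distinction is tagged at source.
@dataclass
class ModelRun:
    run_id: str
    run_type: str            # "realistic" or "simulation"
    assumption_set: str      # which documented assumption set was used
    scenario_count: int

def runs_for_reporting(runs):
    """Only realistic, reporting-grade runs are pushed to the warehouse
    and on to the reporting layer; simulation runs stay in working storage."""
    return [r for r in runs if r.run_type == "realistic"]

runs = [
    ModelRun("R-2011-Q1-001", "realistic", "base_2011", 10000),
    ModelRun("S-2011-Q1-872", "simulation", "stress_draft", 500000),
]
print([r.run_id for r in runs_for_reporting(runs)])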


There is not much time left for insurers to update their internal models: they need to be confirmed in Q1 2011 so that results for 2010 and 2011 can be calculated.

Systems Integration

The IT systems across the risk architecture landscape are not well integrated, leading to a number of manual handshakes, data corruption, and a lack of audit trail. Actuaries and risk analysts have a propensity to use spreadsheets and desktop applications to carry out hugely important computations, and the data inputs and outputs of such standalone systems are a major integration challenge.
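Even a thin automated layer around such a hand-off can validate inputs and leave an audit trail instead of relying on a manual copy-and-paste; the sketch below is illustrative only, with invented file names, fields and checks:

import csv, hashlib, json, datetime

def load_spreadsheet_extract(path: str, audit_log: str = "audit_trail.jsonl"):
    """Load a CSV extract of a desktop calculation, run basic checks,
    and record what was loaded, and when, in an append-only audit log."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Basic validation instead of a silent manual handshake.
    missing = [i for i, row in enumerate(rows) if not row.get("policy_id")]
    if missing:
        raise ValueError(f"{len(missing)} rows missing policy_id in {path}")

    # Audit record: source file, content hash, row count, timestamp.
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    with open(audit_log, "a") as log:
        log.write(json.dumps({
            "source": path,
            "sha256": digest,
            "rows": len(rows),
            "loaded_at": datetime.datetime.utcnow().isoformat(),
        }) + "\n")
    return rows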


There are a number of reasons for these problems:

Poor Governance - Poor governance is arguably the most important factor amplifying the IT problems insurers face with their risk systems. There is a lack of overarching governance that encompasses people, processes and applications. A side effect of this is inadequate data governance, leading to issues such as dirty data, duplicated data, a lack of audit trail, and ignorance about the volumes, history, source and utilisation of the data. All of this makes it difficult to manage the data and, more importantly, to report it accurately and reliably.

Legacy Apps - Most insurance companies still use legacy applications to support their critical business needs. This is most likely a function of the long durations for which insurance companies have to maintain their business information (e.g. policies, investments), which makes it difficult to redeploy their front-end systems onto newer, more flexible and more suitable solutions. Several data management issues arise from the continued use of legacy applications.


Incongruent Data Structures - The older data structures in the legacy systems are unable to handle new business models, new product definitions, or additional customer information. This means that IT has to devise tactical and sub-optimal ways of using the older constructs to service modern-day needs. These workarounds bring their own problems in terms of limited reusability and a lack of transparency and availability to the larger organisation.

Poor System Controls - The controls built into the older systems seem extremely limited now, given the advances in logging, access management techniques and metadata management. The lack of proper controls leads to poor data quality and traceability. As some policies (e.g. annuities) stay active for a long duration, firms are burdened with huge volumes of historical data in their operational systems. Older data tends not to be as clean as recent data, so data cleansing is a significant issue when pulling the data into the models and onward into reporting. The issue is exacerbated by the incongruent data structures going back in time.
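A small, hypothetical sketch of the kind of cleansing gate that helps here: historical policy records are screened for the most common defects (missing or malformed dates, impossible values, retired product codes) before being allowed into the modelling data set:

from datetime import date

# Hypothetical checks applied to historical policy records before they are
# pulled into the internal models and onward into reporting.
VALID_PRODUCT_CODES = {"ANNUITY", "TERM_LIFE", "WHOLE_LIFE"}

def is_clean(policy: dict) -> bool:
    try:
        start = date.fromisoformat(policy["start_date"])
    except (KeyError, ValueError):
        return False                      # missing or malformed start date
    if policy.get("product_code") not in VALID_PRODUCT_CODES:
        return False                      # retired or unknown product code
    if policy.get("sum_assured", 0) <= 0:
        return False                      # impossible monetary value
    return start <= date.today()          # no future-dated history

history = [
    {"start_date": "1987-03-01", "product_code": "ANNUITY", "sum_assured": 50000},
    {"start_date": "31/02/1990", "product_code": "PLAN_X", "sum_assured": -1},
]
clean = [p for p in history if is_clean(p)]
print(len(clean), "clean of", len(history))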


Fragmented Delivery Structure

Many organisations are grappling with a fragmented delivery structure due to a siloed approach towards risk management. This approach tends to create its own set of problems, particularly when delivering Solvency II compliance. We have seen a motley group of permanent staff, contractors, multiple service providers and tool specialists all working incongruously to deliver what is one of the most complex IT transformation programmes.

In addition, a number of big insurers have distinct business units, such as subsidiaries, associated firms or geographical divisions, which conduct their own risk modelling. This divisional fragmentation creates its own set of problems, as all of these business units (even those based outside Europe) have to adhere to the Solvency II regulations.

A fragmented delivery structure has a detrimental impact on several aspects of data management: weaker data governance, poor integration between the various data stores and data integration systems, and poor data requirements gathering and traceability.

It is important to have a senior, influential Solvency II programme team to manage the various teams and business units and to ensure clear lines of reporting within each of them. A Solvency II steering committee is also needed to establish governance over the programme and to ensure that high-level concerns are escalated appropriately and addressed at the earliest.

At the same time, it is advisable to probe the need for multiple solution providers and vendors and to assess whether there is any way of rationalising them. Empirical evidence suggests that, to foster a more collaborative and trusted delivery model for programmes such as Solvency II, there should not be more than four or five vendors or solution providers at a time.

“Excessive” Use of Packaged Solutions

The theory behind using packaged COTS (commercial off-the-shelf) products seems reasonable, but it does not always work out in practice. COTS products are deemed to offer significant savings in procurement, development and maintenance. However, we have found that they may further complicate matters in large transformation programmes with tight timelines.

At one of our clients, we found four different COTS products being implemented at the same time to support the Solvency II delivery. All of the products interfaced closely with each other, so there was no stable baseline for any of the individual software implementations; the design of each software component was evolving simultaneously. One of the biggest challenges with packaged solutions is their use of data definitions or terminology that is inconsistent with what the organisation uses. Each tool comes with its own way of storing hierarchies and naming key data entities.

For example, a packaged asset data management solution may define a product differently from a modelling solution. The problem is worsened further if tool expertise does not already exist within the programme team and a team of tool specialists has to be brought onsite, which adds to the fragmented delivery structure mentioned earlier. All the advantages of pre-built software may be wiped out if the programme has to deal with inconsistent data definitions and an “out of place” team.
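One hedged way of containing the mismatch is an explicit translation layer at the interface between the tools, rather than ad hoc fixes inside each product; the field names below are purely illustrative:

# Hypothetical mapping from an asset data management tool's field names to
# the canonical terms the modelling tool (and the wider firm) expects.
ASSET_TOOL_TO_CANONICAL = {
    "instrument_class": "asset_class",
    "book": "portfolio",
    "product": "instrument_type",
}

def translate_asset_record(record: dict) -> dict:
    """Rename the asset tool's fields so both packaged products speak
    the same language at the interface."""
    return {ASSET_TOOL_TO_CANONICAL.get(k, k): v for k, v in record.items()}

print(translate_asset_record({"product": "corporate_bond", "book": "GBP_credit"}))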

Implications beyond Solvency II

The challenges highlighted above influence not only Solvency II programmes but also many other transformation programmes within organisations. Resolving them as part of delivering Solvency II compliance will ensure that similar programmes run much more smoothly and efficiently. The Solvency II regulations do pose considerable challenges for insurance companies; however, there are considerable benefits in getting things right. It will help companies build a clear view of their product portfolio, refine their internal models and shorten their year-end reporting cycle. It will also allow them to take new products to market more quickly, underwrite more business and detect fraud more effectively.

(Abhishek Toraskar is a Manager in a Big 4 consultancy and specialises in the Data and Business Intelligence domain for large Financial Services clients. The views expressed are his own.)