Never has the world of reinsurance been faced with as many pressures as it is today. Start with legacy issues such as mold or asbestos, add emerging exposures such as nanotechnology or climate change, throw in a pinch of regulatory issues such as Solvency II and, of course, stir in the economic condition and the impact on the overall insurance industry. What you end up with is definitely not a piece of cake.

Years of soft market conditions followed by sudden and sharp hardening periods, tremendous losses from man-made and natural catastrophes and open-ended liabilities such as nanotechnology and terrorism are putting severe pressure on reinsurers.

The market is hardening in the reinsurance sector or, at least, the soft market has bottomed out. Whether rates rise in January 2009 or 2010, reinsurers expect increased demand for reinsurance, and hope to raise rates by at least 10% in certain lines at the January 1 renewal. While some reinsurers may shield themselves from negative cash flow by renewing the book with minimal increases, by 2010 market rates will have to go up as capital becomes scarce. Although most insurers and reinsurers have taken few direct hits from subprime mortgage investments, the subsequent fall in stock markets has prompted write-downs and depleted investment income at insurers, consuming significant portions of their equity capital.

Post-crunch, fundamental issues will be examined. In the United States, a perceived failure of risk management, control and supervision despite SOX may lead to increased scrutiny of risk controls, and in turn to even more stringent capital and liquidity requirements. And in Europe, Solvency II, scheduled for a 2012 implementation date (at press time), requires the use of data from a broad range of sources to measure and monitor the risks driving a risk-based economic framework, including risk-based capital requirements. Firms will need proper processes in place for identifying and quantifying their risks in a coherent framework.

In an increasingly complex, interlocked and fast-moving world, where should a reinsurer best deploy capital in its IT investments? Like primary insurers, reinsurers are experiencing what Novarica terms the "frostbite effect" in IT strategy: pulling in from the edges to protect the core capabilities they need to remain viable and competitive in a challenging market. These core areas are data and modeling, compliance and control, and operational efficiencies.


We all know you can't model everything, and modeling certainly isn't a perfect science. Who would have thought that Hurricane Ike not only would go through Texas, but would create storm damage clear up to Ohio? But with these kinds of changing exposures underway, the use of catastrophe models is critical to pricing and managing catastrophe or event excess-of-loss reinsurance.

With any kind of modeling or analysis, the output is only as good as the data input into the model. At the end of the day, those reinsurers who focus on data capture, data accuracy and data completeness not only improve their own profitability, but also prepare themselves for increased scrutiny of their own books as increased regulation looms on the horizon. Reinsurers need to obtain more complete and accurate data from primary carriers, focus on data cleansing, and continue to utilize sophisticated analytical tools to find new opportunities for creating competitive advantage.


Whether or not a reinsurer is likely to be directly affected by Solvency II, increased scrutiny from regulators is coming. Improving data collection, and the ability to use that data, is a key initiative most reinsurers will need to keep pursuing in order to meet regulatory standards.

Reinsurer IT groups should collaborate with risk management and finance departments to proactively prepare for more frequent data calls requesting more detailed levels of data, in order to assure compliance with new risk-based capital requirements. They should look toward initiatives focused on data integration, analysis and modeling. For some, new reporting or accounting systems will be warranted to improve data availability and the efficiency of auditing.


Just as important as improving risk analysis and decision-making capabilities is improving workflow and processing efficiency. Most reinsurers still work with paper-based files, which are extremely labor-intensive, inefficient and prone to poor data quality. Manual processes typically cannot provide integrated views of customers, offer inadequate audit trails and carry a high risk of error. Unlike primary carrier transactions, reinsurance transactions are low-volume and high-value, so it is more difficult to make a compelling case for implementing straight-through processing.

An increasingly prevalent choice is the use of a reinsurance exchange such as RI3K or eReinsure. While the exchanges are unlikely to replace traditional reinsurance relationships, they enable brokers and reinsurers to exchange large amounts of information in a more structured and auditable form. Electronic placement reduces the likelihood of inaccurate data, improves data integrity and dramatically reduces cycle time compared with traditional paper submissions, and an electronic trail of the reinsurance transaction can help companies comply with increased regulatory requirements. All that, plus lower transaction costs, makes reinsurance exchanges an increasingly popular option in the placement of reinsurance.

The promise of improvements in productivity, customer service and efficiency is fueling interest in workflow applications among reinsurers. Embedding a rules engine in the underwriting process allows a reinsurer to reduce the number of low-value tasks employees perform, freeing them to focus on more complex aspects of reinsurance. Rules-based underwriting software systematically captures data, both for immediate underwriting needs and for long-term data analysis. Consistency in processing delivers more accurate data, and automation reduces the time necessary to perform the work.

In this uncertain market, reinsurers that excel in data management, compliance and control, and operational efficiencies will have an advantage against their less advanced competitors.

Karlyn Carnahan is a principal in Novarica's insurance practice, focused on e-business, marketing and IT planning for insurers and reinsurers. She can be reached at kcarnahan

(c) 2009 Insurance Networking News and SourceMedia, Inc. All Rights Reserved.
