A routine, "fit-for-purpose" data assessment helps carriers plan wisely.

The world is awash in data. Just spend a few minutes on the Internet and you'll find hundreds of companies offering data purported to support your business and bring you value. Only a few years ago, technology designed to collect, warehouse and mine data was considered bleeding edge. Today, those technologies are part of standard operating procedure.

The rapid growth in available data, sophisticated software and consulting services focused on business optimization creates opportunities for companies seeking to dominate their markets. The approach that leading companies take, and that delivers the greatest value, has three interconnected parts: acquisition, validation and prediction.

These three concepts form the basis of mission-critical, "fit-for-purpose" data strategies. Executives leverage these concepts to gain the most from their investment in business optimization initiatives.

How rigorously a company assesses a project's fitness for purpose on a periodic basis, and readjusts the plan to eliminate substandard results, determines whether it stays competitive, moves to the front of the pack or gets left behind.


The benefits of data acquisition are directly related to fit-for-purpose: the appropriateness of the data for its intended use. Further, the value of that data is highly reliant upon each carrier's business aptitude, and how well the data is mapped and integrated into carriers' business applications.

For several years, competitive carriers have been archiving data captured during the new business process. They also have purchased data from vendors with the goal of accessing data at the right point in the underwriting or claims workflow to support critical business decisions.

While data is available from a variety of providers from a wide spectrum of business verticals, inappropriate data and its incorrect application can result in skewed business models and incompatible business decisions. For this reason, optimizing the benefits of data often requires the assistance of expert data analysts.

One of the newest methods for staying ahead of the pack is through a data consortium. An early example of this is the Comprehensive Loss Underwriting Exchange (CLUE) database. This database is one of the first instances of an insurance consortium created to provide the property insurance industry with consumer claim information. In a consortium data-sharing model, each participating carrier agrees to allow a predetermined set of data to be stored and accessed, with appropriate security measures, by other consortium participants. The pool of data from a large number of participants improves the statistical accuracy of data analysis.

Other examples of this type of consortium are being brought to market now. This is especially significant for small- to mid-sized companies, which gain the ability to remain competitive despite the fact that their own market share (and, therefore, database size) is not large enough to support market observations, product enhancements and evidence-based decisions. Early participants in consortium opportunities gain a knowledge advantage that improves business performance substantially, allowing them to get ahead and stay ahead of the pack of later adopters.

Data sharing is optimized when the data is based on a common platform, data library and layout from within a core vertical market. For that reason, data sharing within the property insurance vertical provides more relevant data, ensures better data quality and provides the necessary data points to drive specific property market-related business models.

In addition, the inherent risks and costs in acquiring third-party data analysis, validation and integration are minimized or removed.


The standard way for property insurers to use data acquired through a consortium or other means is in routine reports that use underwriting and claims data for insight into areas such as agent or staff performance, customer service, cycle time, etc. But, while the reporting tools are sufficient for telling a manager about yesterday, they offer little insight into what tomorrow may bring.

Insurers turn to data analytics methods to transform their current and historical underwriting and claims data into meaningful business intelligence. The next strategic step for pack leaders is to fully embrace multivariate analysis. As the name implies, multivariate analysis describes a collection of procedures that involve the observation and analysis of more than one statistical variable at a time. The lessons learned provide much deeper insight into carrier performance for greater competitive advantage. Multivariate analysis crosses traditional departmental boundaries into operational areas such as claims, underwriting and special investigation units.

For example, a multivariate analysis of average claim severity that includes the underwriting variables of interior finish, year built, building codes and geo-coded location may indicate that homes built prior to 1940 in certain segments of the northeastern United States are desirable risks, while a structure with the same building characteristics in the southeastern states is very undesirable. Studies such as these help validate predictive models as well as manage claim severity. Additionally, this same type of analysis is used to "flag" specific properties for additional claims or underwriting review and possible remediation, which is another example of data quality validation.
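The segmentation described above can be sketched in a few lines of code. This is a minimal illustration, not a production analysis: the claim records, severity figures and segment labels below are entirely hypothetical, and a real study would draw on the underwriting variables the article lists (interior finish, building codes, geo-coded location) across a far larger data set.

```python
from collections import defaultdict

# Hypothetical claim records pairing two underwriting variables with a loss amount.
claims = [
    {"year_built": 1925, "region": "Northeast", "severity": 4_200},
    {"year_built": 1930, "region": "Northeast", "severity": 3_900},
    {"year_built": 1928, "region": "Southeast", "severity": 11_500},
    {"year_built": 1935, "region": "Southeast", "severity": 12_800},
    {"year_built": 1990, "region": "Northeast", "severity": 6_100},
]

def average_severity_by_segment(records):
    """Group claims by (construction era, region) and average the severity.

    Crossing two variables at once is the simplest form of the multivariate
    view: it can reveal that the same building era performs very differently
    in different regions.
    """
    totals = defaultdict(lambda: [0.0, 0])  # key -> [sum of severity, count]
    for r in records:
        era = "pre-1940" if r["year_built"] < 1940 else "1940+"
        key = (era, r["region"])
        totals[key][0] += r["severity"]
        totals[key][1] += 1
    return {key: total / count for key, (total, count) in totals.items()}

for segment, avg in average_severity_by_segment(claims).items():
    print(segment, avg)
```

With the made-up records above, pre-1940 Northeast homes average a far lower severity than their Southeast counterparts, mirroring the article's example of identical building characteristics producing opposite risk profiles by geography.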


Perhaps the most rapidly changing part of a property insurer's data strategy is the implementation of advanced "risk-based" or predictive modeling to improve bottom-line results. The basic objectives of risk-based modeling are to identify problematic risks before they are written and create price competitiveness based on the right price for each type of risk.

Risk-based modeling is a systematic approach that involves the entire business, from underwriting through claims, underpinned by large quantities of amassed and validated data. The P&C industry has already seen the financial success that leading personal auto carriers have achieved by using databases of auto and driver characteristics to expand the number of rating classifications to more accurately reflect the level of risk. For insurers offering homeowners coverage, the data used in risk-based modeling to develop myriad pricing tiers is objective, and includes observable building characteristics and claims experience.

Leading-edge risk-based modeling that leverages these concepts depends heavily upon three key principles:

1. The model must focus on profit and loss captured on an individual property basis. This supports the insurance principle that premiums and losses are based upon specific properties.

2. Model results are clearer and insights are better when each different type of peril is studied separately. For example, home characteristics that contribute to the likelihood of a fire loss do not necessarily factor into the likelihood of a wind loss.

3. Modeling requires enough data to perform statistically sound analyses. Not many carriers archive enough data to meet this criterion. The model's strength, however, depends not on how many policies an insurer has, but rather on the number of policies for which the insurer has consistent, validated data.
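The first two principles above can be sketched as a toy rating calculation. Everything here is assumed for illustration: the peril list, the frequency and severity figures and the relativity factors are invented, and a real model would fit each peril's frequency and severity from the carrier's validated claims data rather than hard-code them.

```python
# Hypothetical per-peril models: each peril is studied separately (principle 2),
# expressed here as an assumed (annual claim frequency, average severity) pair.
PERIL_MODELS = {
    "fire":  (0.004, 45_000),
    "wind":  (0.020, 9_000),
    "water": (0.015, 7_500),
}

def expected_annual_loss(relativities):
    """Sum frequency * severity across perils for one individual property
    (principle 1), scaled by property-specific relativity factors, e.g. frame
    construction raising fire risk or coastal exposure raising wind risk.
    A missing relativity defaults to 1.0 (the book-average property).
    """
    total = 0.0
    for peril, (frequency, severity) in PERIL_MODELS.items():
        total += frequency * relativities.get(peril, 1.0) * severity
    return total

# A frame home in a wind-exposed area: elevated fire and wind relativities.
print(expected_annual_loss({"fire": 1.4, "wind": 1.6}))
```

Keeping the perils separate, as in the dictionary above, is what lets a characteristic such as roof age influence the wind component without distorting the fire component, which is exactly the point of principle 2.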

Essentially, predictive models create insights into the claims implications of accepting certain risks. The end result is the ability for carriers to improve margins by pursuing more favorable risks and restructuring pricing on loss-prone properties.


Companies in parallel insurance markets that leverage data in the way described in this article have experienced growth rates of 17% per year for more than 10 years. The knowledge, data and technology needed to carry this performance level into the property insurance arena have matured and are already speeding homeowners carriers to the head of the pack. It is only a matter of time before the profitability gap between carriers that aggressively implement a data strategy of acquisition, validation and prediction and those that do not becomes impossible to close.

Anthony Hetchler, director, claims analytics; Jonathan Kost, senior VP; Karen Linnell, assistant VP, residential solutions, and Micheline "Mike" Szocs, director of strategic initiatives, are employed by Marshall & Swift/Boeckh LLC, New Berlin, Wis.
