What's Needed? An Olympian Effort

Before they landed triple Axels, they learned how to balance on a blade and skate across the ice. Before they attacked moguls at blazing speeds, they cruised down relatively gentle slopes. Before they maneuvered a puck the length of the rink while weaving through opponents, they learned how to carry a stick while skating.

Sure, much of what viewers saw at the recent Olympic Games in Torino, Italy, involved the awe-inspiring skills of elite athletes. All of these competitors, however, mastered the basics first. Many other athletes-think weekend warriors-never move much beyond these fundamentals.

Although the work of insurance companies is not likely to be met with the fanfare of an international sporting competition, some of the dynamics are the same. Insurance companies that are moving ahead with strategic data initiatives involving the use of sophisticated technologies all started by mastering the basics. And, these companies remain in an elite class because the majority of insurers still struggle with data fundamentals.

Just about all insurance companies, however, are at least acknowledging they need to start using data more strategically. More specifically, insurers need to share data enterprisewide and analyze it quickly to make better decisions in response to market changes and competitive pressures.

What's needed? Nothing short of an Olympian effort. First, insurers need to master the basics-no small task considering the industry's long-standing struggle with all things data. Once the fundamentals are under control, insurers then need to pull ahead of the pack by deploying technology tools that will help them turn data into a strategic asset.

An examination of the data challenges inherent in the insurance industry-along with how companies can master data basics and how some leaders are implementing advanced tools to use data strategically-can help insurers map out a data management plan that turns runners-up into medal winners.

Out of shape

To start, insurers need to understand how they've fallen behind the data quality curve. But insurance executives don't need to beat themselves up, because many other industries are in the same boat, says Thomas Redman, president of Navesink Consulting Group, Little Silver, N.J. "In industry after industry, data is just awful," Redman says.

There are, however, a couple of factors contributing to the insurance industry's data problems.

First, many insurance companies are dealing with legacy computer systems that work well from a process standpoint but provide plenty of headaches when it comes to data sharing and integration, says Jill Dyche, partner and co-founder of Baseline Consulting, a data management consulting group based in Los Angeles.

"By and large, [insurers] are victims of old legacy systems that really make them less nimble," Dyche says.

As a result, many carriers often are left holding only "partial information," which, according to Dyche, can be worse than having no information at all. In addition, many insurance companies have poor-quality data-or "dirty data"-due to less-than-optimal data collection practices.

Compounding the data integration and access problem at insurance companies is the fact that many insurers have grown through acquisition, says Matthew Josefowicz, manager of the insurance group at Celent LLC, a New York-based financial services research and consulting company.

"Because they have grown through acquisition, many insurance companies run their various lines as independent businesses. Because there were systems in place that worked when businesses were acquired, there really was no immediate motivation to change," Josefowicz says.

As a result, data is stuck in silos, either stored in various disparate information systems with incompatible formats or on paper, he adds. And many insurers run into trouble when trying to get data out of the systems-and into even harder brick walls when trying to integrate data from disparate systems. "It can be virtually impossible to get the needed data. In fact, it's not unusual for insurance companies not to know basic things such as how many customers they have."

Not surprisingly, then, insurance companies need to undergo some basic training before they even attempt to implement advanced data strategies, says Navesink's Redman. "It would be really silly for insurance companies to say 'How can I use all this bad data?'" he says.

Instead, insurers need to make sure that they can produce high-quality, reliable data.

Carriers should adopt automated data acquisition and cleansing programs to ensure their systems are populated only with reliable data, says Baseline's Dyche. And Navesink's Redman advises insurers to concentrate on acquisition efforts, not the data clean-up process.

Place responsibility on business owners

"The problem with concentrating too much on data cleansing is that you don't get to the root cause of the problem. If you look at how data is acquired, you can help prevent errors," says Redman.

In addition, Redman suggests that insurers place responsibility for data quality on the business owners. While many companies traditionally put responsibility for data quality on the IT group, it makes more sense to vest the business owners with this responsibility, he says, because ultimately the person who is using the data knows it best.

Having clean data is just the warm up for strategic data initiatives, however. Insurance companies also need to make sure that detailed data from various systems is integrated into one central location such as a data warehouse, says Baseline's Dyche. "You need to have really granular, detailed information from multiple locations integrated into one place," she says.

"Many insurance companies spend a lot of money coming up with corporate strategies-such as getting closer to your customer or increasing profitability with existing customers. But you can't accomplish those goals without having integrated information from disparate sources in one central location," she says.

Certainly, more insurance executives are beginning to realize that corporate initiatives hinge on developing data as a strategic asset.

In fact, that recognition is spreading across the industry, according to recent research from Celent.

Based on the responses of 26 senior insurance IT executives who provided information about priorities, behaviors, initiatives and infrastructures at their companies, the research shows rising interest in data mastery projects. For instance, 74% of respondents are currently implementing a data warehouse, 65% are installing business intelligence/reporting and 57% are investing in operational data stores (see "Data Mastery Priorities").

Although companies are turning to a number of different technologies, the primary goal of all of these tools is to standardize, aggregate and analyze data, Celent's Josefowicz says. "All these technologies give you a better picture of what is going on in the enterprise, so you can manage more effectively and have a better idea of actual risk," he says.

Proof in the final score

Great American Financial Resources Inc. (GAFRI), a Cincinnati-based insurer, is one company that understands the importance of mastering the basics before moving on to more advanced data initiatives.

Like many other carriers, GAFRI has grown through acquisition. Company executives have realized, however, that to make the most of this growth, data cannot remain in separate systems. Instead, GAFRI moves any newly acquired block of business to its single-platform administration system, LifePro from PDMA, Indianapolis.

When GAFRI acquired a book of business that required the conversion of 30,000 fixed annuities, the insurer decided to use the Data Conversion Architect from Universal Conversion Technologies, Addison, Texas. The Data Conversion Architect is an Internet-based tool that enables users to complete the entire data-integration process from their desktop computers.

"Given our strategy for growth, data conversion is an ongoing effort for us rather than a one-time event," says Dave Wilcox, vice president of strategic systems at GAFRI.

"We didn't want the data conversion effort to cause our staff to interrupt other IT projects that were already in progress and take us off track. So we made a decision to identify a partner who had a core competency in insurance data conversions to provide us a conversion bridge," he says.

With this fundamental nailed, GAFRI can move on to more strategic data projects. For example, the company recently launched Seeking Excellence through Enterprise Knowledge (SEEK). Through this program, GAFRI will populate a data warehouse with information from all systems, making it possible to quickly and easily answer business questions, says Joe Leadingham, assistant vice president of applications development at the company.

"By using multiple sources to populate the system, we are pulling together a data warehouse that will give us the knowledge to answer all kinds of business questions," he says.

The enterprise database will enable GAFRI to answer myriad questions because it will contain information that provides a complete view of its customers.

The company is taking an incremental approach with this strategic initiative. "We are building the database in iterative steps," Leadingham says. The first phase of the project is focusing on building an agent scorecard, a tool that will help GAFRI assess agent performance.

Demand analysis

To build this tool, GAFRI executives and staff members are deciding what questions they need answered regarding agents, then identifying the exact data points that will help answer those queries. This data will then be loaded from the core systems to the data warehouse.

"The whole concept is based on demand analysis," Leadingham says. "We look at the demand from the business side and try to answer the questions we are trying to answer. We then scope the core systems to find the data elements we need."

GAFRI plans to continue to build various business intelligence tools using this same method.

"This is not a project, but an ongoing mode of operation or way of life," says Leadingham. "We plan to continually look at different issues and build the knowledge into the data warehouse to help us act more strategically."

Although GAFRI is just starting to use data more strategically, some leading insurers have already reaped the benefits of data as a strategic asset.

Bringing home gold

Consider recent computer initiatives at Liberty Mutual Insurance Co., Boston.

Over the past few years, workers' compensation medical costs have risen, growing at a much faster clip than inflation. At the same time, medical treatment costs eclipsed wage payments, becoming the biggest slice of the workers' compensation pie.

And, in 2004, workers' compensation dollars spent on medical care exceeded those spent on indemnity, according to Jeff Kramer, national market workers compensation claims operations manager at Liberty Mutual.

Faced with these market pressures, the insurer needed to do something.

"Various events in the market set off a lot of bells and whistles," Kramer says. "We realized we needed to start controlling costs, and we needed to start focusing on medical costs and data. We needed to dig in and know where the dollars were being spent."

The company had a lot of historical data from multiple customers in multiple states. "We needed to take that data and start using it in a new way," adds Kramer. "We needed to identify and analyze where the dollars were being spent."

To get more out of the data, Liberty Mutual teamed up with Thomson Medstat, Ann Arbor, Mich. Medstat provides market intelligence and benchmark databases, decision support solutions, and research services for managing the cost and quality of healthcare.

Liberty Mutual has worked with Thomson to customize the Medstat tool, which was designed for health maintenance organizations (HMOs), to fit the needs of workers' compensation insurers.

The customized tool uses advanced analytic rules and a decision-tree engine to give the insurer a better handle on medical costs.

The tool is helping Liberty deliver the best care at the best price.

"We are using the tool to identify the providers that give the best results for our customers. We want to make sure our members can get treatment at the proper time and still pay a fair and reasonable amount to the provider," Kramer says.

In addition, the advanced analysis of data has produced several other benefits, which include:

  • Better networks. Only those providers who have demonstrated cost-effective care are included in the medical care networks available to policyholders in states where employers direct the care of injured workers.
  • Lower treatment costs. Even when Liberty Mutual is working in states where employers can't direct care, the data is producing the intelligence required to negotiate with individual hospitals and doctors, showing them how they compare with the most cost-efficient providers. For example, the data tool can identify the average charge for certain services. "If the average is $120 for an office visit, for example, we know that we have to negotiate a better rate if a provider is charging $300," Kramer says. (A rough sketch of this comparison appears after this list.)
  • Better care and faster return of workers. Uncovering the most effective medical care for specific injuries enables Liberty Mutual to share best practices with medical providers, thereby providing the care that brings about the best outcome for insured workers and helps them quickly return to work.
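
As a rough illustration of that benchmark comparison, here is a short sketch with made-up charge data and an illustrative tolerance threshold:

# Illustrative figures only; the benchmark would come from the data tool.
benchmark_average = 120.00   # average charge for an office visit
provider_charges = {"Provider A": 118.00, "Provider B": 300.00}
tolerance = 1.25             # flag charges more than 25% above the benchmark

for provider, charge in provider_charges.items():
    if charge > benchmark_average * tolerance:
        print(f"{provider}: ${charge:.0f} vs. ${benchmark_average:.0f} average -- negotiate a better rate")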

Dutch insurer FBTO Verzekeringen is another company that is ahead of the data curve. FBTO has turned to predictive analytics to market its products more strategically.

FBTO formerly marketed its products through mass mailings. However, because those campaigns were not targeted, the conversion rate was relatively low.

As a result, FBTO wanted to use data intelligence to model customer behavior, predict customer needs, anticipate customer reactions to special offers and then use these insights to make the marketing program more effective.

To accomplish these goals, FBTO turned to PredictiveMarketing, a predictive analytics tool from SPSS, Chicago. Predictive analytics is the branch of data mining concerned with forecasting future probabilities and trends.

The central element of predictive analytics is the predictor, a variable that indicates an individual's likely future behavior. The SPSS tool, for example, enables FBTO to identify customers or prospects with a better-than-average propensity to purchase products.
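
As a generic illustration of how a propensity score is produced-this is not the SPSS product itself, and the features and training data below are invented-a minimal sketch using an off-the-shelf classifier:

from sklearn.linear_model import LogisticRegression

# Invented training data: [age, existing_policies, responded_to_last_mailing]
X = [[25, 1, 0], [44, 2, 1], [31, 1, 1], [58, 3, 1], [36, 1, 0], [50, 2, 0]]
y = [0, 1, 0, 1, 0, 1]  # 1 = purchased after a previous campaign

model = LogisticRegression().fit(X, y)

# The "predictor" here is the estimated probability that a prospect will buy.
prospects = [[40, 2, 1], [29, 1, 0]]
for features, score in zip(prospects, model.predict_proba(prospects)[:, 1]):
    print(features, f"propensity to purchase: {score:.2f}")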

Using this predictive analytics tool, FBTO has increased its sales conversion rate by more than 40% and decreased direct mail costs by 35%.

Results like these simply prove how valuable it is to start to use data strategically, Navesink's Redman says. "It really is a cool time to be a data professional in the insurance industry. The opportunity to exploit data to be more competitive is there like it never was before," he says.

Identifying Good Data

It's not as easy as squeezing a melon at the supermarket, but it is possible to tell the difference between good and bad data. Quality data has the following attributes:

  • It's accurate. The information in the database really represents what it is supposed to capture.
  • It's valid. The value for a data element in the database is an allowable value.
  • It's reasonable. The data is consistent with prior data or other available information.
  • It's complete. Every recorded transaction contains all the necessary information, and all pertinent transactions are being recorded and reported.
  • It's timely. Transactions are consistently recorded and processed within established time frames.

Source: "Data Quality: Another Road to Insurer Competitive Advantage," a report from GE Insurance Solutions, Kansas City, Mo.

Dirty Data and Cold Cash

Dirty data is not just a nuisance, it also costs companies money, according to the Data Warehousing Institute, Seattle.

Consider the following:

Poor-quality customer data costs U.S. businesses a staggering $611 billion per year in postage, printing and staff overhead.

With an error rate of only .001, an insurance company that receives 2 million claims per month with 377 data elements per claim will commit 754,000 claims data errors per month and more than 9.04 million per year.

If it costs the insurer $10 per error to cover the staff time to fix the error, then the company's exposure to poor-quality claims data is $10 million per year.
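
The error-volume arithmetic behind that example can be reproduced in a few lines:

claims_per_month = 2_000_000
data_elements_per_claim = 377
error_rate = 0.001

errors_per_month = claims_per_month * data_elements_per_claim * error_rate
errors_per_year = errors_per_month * 12

print(f"{errors_per_month:,.0f} data errors per month")   # 754,000
print(f"{errors_per_year:,.0f} data errors per year")     # about 9.05 million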

In addition, poor data quality can result in overpricing, underpricing, writing too many bad risks, regulatory fines, and losing business from disgruntled customers, according to GE Insurance Solutions, Kansas City, Mo., in a report titled "Data Quality: Another Road to Insurer Competitive Advantage."

John McCormack is a freelance writer based in Riverside, Ill.
