Data Management in the Rapidly Changing Insurance Domain

Data not only drives the core processes in financial services and insurance; it is also a byproduct of those processes. The breadth and depth of available data are rapidly increasing as the number of external (public and private) sources grows and data becomes more granular. New technologies allow for improved access to data and expanded capabilities of analytical tools. In turn, all types of financial service providers, especially insurers, are making increased use of this data, technology and analytics to gain competitive advantages.
 
Also increasing is the number of data “users,” including insurers, regulators and consumers, along with their expectations of instant access to data and technologies. These diverse constituencies use the data for a variety of purposes, including risk selection, rate regulation, rate and price analysis, enterprise risk management, solvency evaluation, marketing, product development, market conduct monitoring, fraud detection and prevention, loss control, and product comparison and selection.
 
A Different Perspective
 
As the focus of insurance information shifts, so does the impact on data and information management and quality. Enterprisewide data impact for insurers can best be examined from four different perspectives—regulation, data analysis, technology and globalization.
 
Data management has its roots in accounting. Within a company, it exists in general ledgers and accounts receivable, and externally, in regulatory accounting requirements. Because of this, the information shift has been the most dramatic in the regulatory reporting arena. As data availability and access have increased, so has the regulation of this data. Annual statements were supplemented by market-conduct annual statements, which led to National Association of Insurance Commissioners (NAIC) and state databases. Financial data was used to monitor solvency. Now, financial data, statistical data and analytics are also used to monitor enterprise risk. And U.S.-driven regulations have given way to internationally driven regulations with increased emphasis on solvency and privacy.
 
The data management implications of the regulatory shift include the need for data transparency; support for internal controls; the promotion of clear, standardized, comparable information; and increased emphasis on privacy, confidentiality and compliance.
 

While not as dramatic as those in the regulatory arena, information shifts in analytics have had the greatest impact on insurance products and services. Just think about how differently policies were administered and claims settled in the 1900s compared with the 2000s. Traditional underwriting and pricing using traditional data sources (risk data and industry statistics) have given way to predictive modeling and analytics that use nontraditional data sources, such as demographics, geographic information systems, third-party data, noninsurance data and nonverifiable data. There have also been shifts from risk-specific management to enterprise risk management, from a stable risk control and claims environment to a dynamic environment of new hazards (mold, terrorism, computer viruses, cyber terrorism), and from traditional actuarial pricing methodologies to predictive models, most notably catastrophe models. What’s more, noninsurance-specific data, such as credit scores, insured occupation and household data, is now being used for pricing and underwriting.
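To make the pricing shift concrete, the sketch below shows a simple multiplicative rating plan that blends a traditional territory factor with a credit-based factor. The base rate, factor tables and score bands are illustrative assumptions, not an actual rating plan.

```python
# Minimal sketch of a multiplicative rating plan that blends a
# traditional territory factor with a nontraditional, credit-based
# factor. All rates, factors and score bands here are hypothetical.

BASE_RATE = 500.00  # illustrative annual base premium, in dollars

TERRITORY_FACTOR = {"urban": 1.25, "suburban": 1.00, "rural": 0.90}

def credit_factor(score: int) -> float:
    """Map a credit score band to a rating factor (illustrative values)."""
    if score >= 750:
        return 0.85
    if score >= 650:
        return 1.00
    return 1.20

def rate_policy(territory: str, credit_score: int) -> float:
    """Combine traditional and nontraditional factors multiplicatively."""
    return BASE_RATE * TERRITORY_FACTOR[territory] * credit_factor(credit_score)

print(rate_policy("suburban", 780))  # 425.0
print(rate_policy("urban", 600))     # 750.0
```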
 
There are many associated data management issues, including an increased focus on data completeness, transparency and accuracy; managing new, different and more granular data; and reducing the cost and time associated with data collection, storage and dispersal. Information is being made available more quickly, and there is a need to promote the interoperability of data and databases. This allows for better integration, thereby giving users more options for how data can be used.
 
Data content and definitions must be closely managed throughout the organization to promote internal and external consistency across business units and over time, ensuring enterprise data quality and communication among various sources. The issues associated with repurposing data (contractual, regulatory, technological, data quality and mapping across disparate sources) must be recognized, and data gaps and significant differences must be documented.
 
Changes in data go hand in hand with changes in technology, as technology enables data access and availability. Recent examples include movement from centralized, highly controlled technologies to application service providers (ASPs), the Internet, XML, local area networks (LANs) and personal computers (PCs). There have also been shifts from technology as a business enabler to technology as a business driver, from mainframes to LANs and high-powered PCs, and from data collection to extract, transform and load data processes. Risk managers are also embracing new technologies, such as handhelds, voice over Internet protocol, smart phones, global positioning systems, black boxes, radio frequency identification and weather data.
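As a simple illustration of the extract, transform and load pattern mentioned above, the following standard-library Python sketch reads raw policy rows from a file, standardizes codes and types, and loads the results into a local table. The file name, fields and schema are hypothetical.

```python
# Minimal extract-transform-load (ETL) sketch using only the standard
# library. The file name, column names and table schema are hypothetical.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    """Extract: read raw policy rows from a source CSV file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize codes and types into the target schema."""
    out = []
    for row in rows:
        out.append((
            row["policy_id"].strip().upper(),
            row["state"].strip().upper(),   # standardize state codes
            float(row["written_premium"]),  # cast currency text to numeric
        ))
    return out

def load(records: list[tuple], db_path: str = "warehouse.db") -> None:
    """Load: write cleaned records into a warehouse table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS policy "
                "(policy_id TEXT PRIMARY KEY, state TEXT, written_premium REAL)")
    con.executemany("INSERT OR REPLACE INTO policy VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

load(transform(extract("policies.csv")))
```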
 
The impact of these technological shifts on data and data management is tremendous. Information is now being managed over many moving and continuous data points, as opposed to fixed points in time. In this dynamic environment, data managers must ensure the quality of new types of data and determine how to use and store them. They must assess the need for “trigger points” to protect information from inappropriate use and balance the demand for more granular data against the cost and time associated with data collection, storage and dispersal. Both structured and unstructured data need to be managed, with the interoperability of data and databases a key component.
 
Globalization is also having a major impact on the data environment. Outsourcing not only affects IT and business functions, but data management as well. Expanding business beyond U.S. borders, the need to educate foreign staff about U.S. issues (and vice versa) and—most significantly—understanding and respecting cultural differences all play a part.
 
From the global perspective, the data quality focus must expand to recognize cultural differences. Procedural manuals, edit packages, data dictionaries, schemas and implementation guides must account for differences in terminology, language and definitions. We must also provide cross-border transparency and increase our emphasis on compliance with international rating and reporting laws and solvency regulations.
 
Meeting Data Management Challenges
 
So how do data managers meet these seemingly overwhelming changes?
 
At the organizational level, data managers must promote data stewardship and enterprise data governance, which includes strategic information planning. It is critical that data strategies align with enterprise and business unit plans.
 
Also at the enterprise level, controls and measures must be established, both for internal and external (third-party) data sources. Some important measures are timeliness, completeness and accuracy, with a focus on a new criterion: verifiability.
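One way to operationalize these measures is to compute them as simple scores over a batch of records, as in the illustrative Python sketch below. The field names, rules and thresholds are assumptions for illustration, not established industry metrics.

```python
# Minimal sketch of enterprise data quality measures applied to a batch
# of records. The field names, rules and thresholds are hypothetical.
from datetime import date

REQUIRED_FIELDS = ("policy_id", "state", "effective_date", "premium")

def completeness(records: list[dict]) -> float:
    """Share of records with every required field populated."""
    ok = sum(all(r.get(f) not in (None, "") for f in REQUIRED_FIELDS)
             for r in records)
    return ok / len(records)

def timeliness(records: list[dict], as_of: date, max_age_days: int = 30) -> float:
    """Share of records reported within the allowed reporting lag."""
    ok = sum((as_of - r["reported_date"]).days <= max_age_days for r in records)
    return ok / len(records)

def accuracy(records: list[dict]) -> float:
    """Share of records passing a simple domain rule (premium > 0)."""
    ok = sum(r["premium"] > 0 for r in records)
    return ok / len(records)

def verifiability(records: list[dict]) -> float:
    """Share of records traceable to a documented source."""
    ok = sum(bool(r.get("source_id")) for r in records)
    return ok / len(records)
```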
 
At the functional level, data managers must take inventory and document data sources. Documentation should include metadata (information that describes the content, quality, condition, origin and other characteristics of data), mapping criteria, data quality and completeness measures, confidentiality and security constraints, and data use and reuse criteria.
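Such documentation can itself be captured as a structured metadata record. The sketch below records the items listed above for a single, hypothetical third-party source; the class and all values are illustrative assumptions.

```python
# One way to document a data source's metadata as a structured record.
# The fields mirror the items listed above; the values are hypothetical.
from dataclasses import dataclass

@dataclass
class DataSourceMetadata:
    name: str               # identifier for the source
    content: str            # what the data describes
    origin: str             # where the data comes from
    quality_measures: dict  # completeness/timeliness thresholds, etc.
    mapping_criteria: str   # how fields map to the enterprise model
    confidentiality: str    # security and privacy constraints
    reuse_criteria: str     # permitted uses and reuses

third_party_credit = DataSourceMetadata(
    name="third_party_credit",
    content="Consumer credit scores used in personal lines pricing",
    origin="External vendor feed, refreshed monthly",
    quality_measures={"completeness": 0.98, "timeliness_days": 30},
    mapping_criteria="Vendor score bands mapped to internal rating tiers",
    confidentiality="PII; restricted to underwriting and actuarial use",
    reuse_criteria="Not approved for claims or marketing repurposing",
)
```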
 
Data managers must also use a number of traditional and new tools. On the traditional side are data models, data and process flows, data mapping documentation, data standards, international naming conventions, data transformation and generation, detailed specifications, regulatory monitoring, data security controls and audits, as well as versioning.
 
New tools and techniques include the use of metadata; increased use of scientific measures; knowledge management (a process within an organization that ensures its intellectual capabilities are shared, maintained and documented); text mining; management of unstructured data; data transparency through documentation, measures and controls; and a relatively new concept: master data management (MDM).
 
MDM is a set of processes to create and maintain a single view of reference data that is shared across systems. It is used to classify and define transactional data through a centralized integration manager. MDM leverages policies and procedures for access, update and overall management of this central resource and its coordination with other participating systems across the enterprise.
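The sketch below illustrates the single-view idea on a small scale: records from three source systems share one master ID, and a survivorship rule picks each attribute from its most trusted, non-empty source. The system names, fields and trust order are illustrative assumptions.

```python
# Minimal sketch of the MDM "single view": reference records from
# several systems are keyed to one master ID, and a golden record is
# built from the most trusted source per field. The source systems,
# survivorship rules and fields here are hypothetical.

# Source-system records for the same customer, keyed to master ID C001.
records = {
    "policy_admin": {"master_id": "C001", "name": "J. Smith",   "phone": ""},
    "claims":       {"master_id": "C001", "name": "John Smith", "phone": "555-0100"},
    "billing":      {"master_id": "C001", "name": "John Smyth", "phone": "555-0100"},
}

# Survivorship policy: which system is most trusted for each attribute.
TRUST_ORDER = {"name":  ["claims", "policy_admin", "billing"],
               "phone": ["billing", "claims", "policy_admin"]}

def golden_record(recs: dict) -> dict:
    """Build the single view by taking each field from its most
    trusted, non-empty source."""
    master = {"master_id": next(iter(recs.values()))["master_id"]}
    for fld, order in TRUST_ORDER.items():
        master[fld] = next(recs[sys][fld] for sys in order if recs[sys][fld])
    return master

print(golden_record(records))
# {'master_id': 'C001', 'name': 'John Smith', 'phone': '555-0100'}
```

In a production MDM hub, the survivorship rules, match/merge logic and coordination with participating systems would be governed by the policies and procedures described above rather than hard-coded tables.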
 
MDM first took root with customer data integration, which covers the management of customer reference data, and product information management, which covers product and supplier reference data. The insurance industry has since expanded its applications of MDM to areas such as managing account, policy and claims data.
 
The Evolution of Data Management
 
Change in today’s data management world is not limited to types of available data or data management tools and techniques. Instead, it is encapsulated within the data management function itself. To survive in this rapidly changing environment, data managers must embrace these changes and actively seek out new frontiers.
 
Data managers must also evolve their status and position within the organization—becoming familiar with business strategies while offering an enterprise data view where appropriate. They must be well versed in enterprise data issues and needs, not only managing current data resources but also seeking out and evaluating new sources.
 
Thomas C. Redman elevates the professional status of data managers even further by discussing the need for a chief data officer. This position is responsible for applying the corporate data strategies and policies defined by corporate data stewards, including their application to data suppliers; leading the data quality program; and owning and housing the metadata process.
 
The amount of data in the world of financial services has been growing exponentially over the years, and the insurance industry—whose foundation is data—is being greatly impacted. Those who want to succeed can no longer stay on the sidelines and watch as opportunity passes by. To succeed, insurers, like most business enterprises, must learn to ride the wave.
 
Reference:
 
Redman, Thomas C. “The Body Has a Heart and Soul: Roles and Responsibilities of the Chief Data Officer.” Information and Data Quality Newsletter, January 2007.

Peter Marotta is enterprise data administrator and principal for Insurance Services Office Inc.

Reprinted with permission from DM Review
