INN's Annual Top 5 Trends for 2012 - Analytics
Of all the trends set to impact the insurance industry in 2012, the wide use of business analytics across the enterprise was likely the easiest to foresee.
As readers of these pages undoubtedly know, insurers across the industry have spent many years and much lucre building out data warehouses and investing in business intelligence and predictive analytics platforms. Martina Conlon, a principal in the insurance practice at Novarica, says that with organic growth strategies and expense reduction among the top business priorities for insurers, effective use of analytics is becoming table stakes across the industry. "You are at a competitive disadvantage if you are not leveraging your data," she says.
What's more, the technological bar is rising. As many of the once-novel analytic functions carriers perform (access to enterprise information, incorporation of external data, dashboards, scoring and the like) become routine, the way they use predictive models to augment processes and products will become the new source of competitive advantage. "There will be a big focus on predictive models the next few years," she says. "Most carriers have a good foundation to use for analysis, so the next step is developing the predictive models and implementing them so they can be pulled from core processes."
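To make the idea of a model being "pulled from core processes" concrete, here is a minimal sketch. The field names, coefficients and thresholds are purely illustrative assumptions, not any carrier's actual model; in practice the weights would come from a model trained offline and the score would feed a real workflow system.

```python
import math

# Hypothetical fraud-score model: the coefficients below are purely
# illustrative stand-ins for values an analyst would fit offline.
WEIGHTS = {"claim_amount": 0.0004, "prior_claims": 0.6, "days_to_report": 0.05}
BIAS = -3.0

def fraud_score(claim):
    # Logistic score in [0, 1] from a linear combination of features.
    z = BIAS + sum(WEIGHTS[f] * claim[f] for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def intake(claim):
    # The model is called inline from the core intake process: high
    # scores route to investigation instead of straight-through payment.
    return "investigate" if fraud_score(claim) > 0.5 else "fast-track"

print(intake({"claim_amount": 2000, "prior_claims": 0, "days_to_report": 3}))
print(intake({"claim_amount": 9000, "prior_claims": 4, "days_to_report": 45}))
```

The point of the sketch is the shape of the integration, not the model itself: scoring happens inside the transaction path rather than in a separate reporting step.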
The endeavor to move analytics from back office to front office has only recently become feasible in light of a constant stream of advances at the platform and application level. Indeed, technology vendors have placed a greater emphasis on integrating analytic algorithms into existing workflows and solutions, offering simpler, if not "out-of-the-box," functionality.
"Analytics is becoming much less a stand-alone activity; it has to be embedded in business processes," says Keith Peterson, VP of Advanced Analytics for Mitchell International. "It has to integrate fully into core products."
One of the ancillary benefits of this, Peterson says, is that business users across the enterprise will become more quantitatively oriented. "A lot of the data prep and initial model-building can be done by a business analyst rather than a highly trained actuary. But you will still need people with a statistical background to determine what's important and how to extract the data from the system and put it into a format that business users can leverage."
Moreover, this broader use of analytics at the business level increases the chance of data from disparate business units cross-pollinating to positive effect. For example, underwriters could gain a better understanding of what happens at the end of a claim.
Peterson also foresees greater use of optimization-based algorithms for the purposes of benchmarking and experimentation. For example, a claims department could take cross-industry data and monitor it over time to better understand what is happening with its own claims performance. It could also test different scenarios of, say, adjuster assignment, to determine which staffing level is optimal for a given geography or line of business before making key personnel decisions. "There's enough data out there now to establish a baseline for your organization and monitor it over time, and as you try new tactical and strategic changes you can actually assess and measure the impact in a pretty granular way," he says. "This way you know very quickly whether a change is adding value."
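The baseline-and-scenario idea Peterson describes can be sketched in a few lines. Everything here is a toy assumption, including the made-up relationship between staffing and cycle time; the pattern to note is simply: establish a measured baseline, run a scenario, and quantify the difference before committing to a staffing decision.

```python
import random
from statistics import mean

random.seed(42)  # deterministic toy data

def simulate_cycle_times(n_claims, adjusters):
    # Hypothetical model: average cycle time shrinks as staffing rises,
    # with random per-claim variation. Real data would replace this.
    base_days = 30.0
    return [base_days / adjusters * random.uniform(0.8, 1.2)
            for _ in range(n_claims)]

# Establish a baseline from current staffing, then test a scenario.
baseline = simulate_cycle_times(n_claims=500, adjusters=10)
scenario = simulate_cycle_times(n_claims=500, adjusters=12)

impact = mean(baseline) - mean(scenario)
print(f"Baseline avg cycle time: {mean(baseline):.2f} days")
print(f"Scenario avg cycle time: {mean(scenario):.2f} days")
print(f"Estimated improvement:   {impact:.2f} days per claim")
```

In a real deployment the simulator would be replaced by historical and cross-industry benchmark data, but the assess-and-measure loop is the same.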
In addition to getting broader, use of data may also soon be getting deeper as insurers inch toward the Big Data era. Indeed, a variety of data streams, from sensor data to information culled from mobile devices and social media sites, promises a deluge of data insurers can leverage for a variety of purposes. One consequence of the move into Big Data will be a renewed focus on metrics around data quality and suitability, adds Stuart Rose, global insurance marketing manager at SAS Institute. "Insurers need to find out how much of the information out there is white noise and how much is really relevant for them," he says.
Another question then becomes whether insurers have the proper systems and architecture in place to take advantage of Big Data. Given the potential size of data sets insurers may need to store and analyze, Rose isn't entirely convinced. "It's not so much lack of understanding of analytics but a lack of infrastructure," he says.
David Smith, VP, insurance solutions, iGATE Patni, a consulting, technology and business process outsourcing firm, says the data provisioning models currently employed by many carriers may become unsupportable in the Big Data era as data sets expand by orders of magnitude. "The traditional ways of storing, managing, processing and analyzing data are no longer adequate," he says. "The stove-piped approach of providing each application with its own servers, storage, and sometimes its own networks, is inherently wasteful."
Speaking at the CIO Insurance Summit in October, Eric Bulis, EVP of operations and CIO for SBLI USA Mutual Life Insurance Company, agreed that carriers have much work to do in preparation for the Big Data era. On the infrastructure side, Bulis said he has begun investigating new technologies, such as those based on the open-source software framework Apache Hadoop, in order to prepare for larger amounts of data. "Right now we're still dealing with terabytes of data," he said. "We are not yet dealing with petabytes but we have to plan for when we get there."
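For readers unfamiliar with frameworks like Hadoop, the core idea is the map/reduce pattern: map each record to key-value pairs, shuffle by key, then reduce each group to an aggregate, so work can be spread across many machines. Below is a minimal in-process sketch of that pattern on toy claim records (the fields and figures are hypothetical); an actual Hadoop job would run the same mapper and reducer logic distributed over a cluster.

```python
from itertools import groupby
from operator import itemgetter

# Toy claim records (hypothetical fields): state and paid amount.
records = [
    ("NY", 1200.0), ("CA", 800.0), ("NY", 450.0),
    ("TX", 300.0), ("CA", 950.0),
]

def mapper(record):
    # Emit (key, value) pairs, as a Hadoop Streaming mapper would.
    state, paid = record
    yield state, paid

def reducer(key, values):
    # Aggregate all values for one key, as a reducer would.
    return key, sum(values)

# Shuffle/sort phase: group the mapped pairs by key.
mapped = sorted(pair for rec in records for pair in mapper(rec))
totals = dict(reducer(k, [v for _, v in grp])
              for k, grp in groupby(mapped, key=itemgetter(0)))
print(totals)  # {'CA': 1750.0, 'NY': 1650.0, 'TX': 300.0}
```

The appeal at petabyte scale is that the mapper and reducer stay this simple while the framework handles distribution, sorting and fault tolerance.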
Bulis said it was vital to impart the importance of advanced analytics to upper management in order to ensure sufficient resources are at hand. "On the life side, innovation can be difficult because there is an entrenched status quo," he said. "But from a marketplace perspective we are going to need Big Data to help us innovate."