If business analytic solutions are ever to become fully realized within the enterprise, the technology must move away from being an exclusive province for mathematically adept power users to a tool palatable and familiar to a variety of end users.
Yet many technical, operational and cultural hurdles need to be cleared for this to happen.
On the technical side, much progress has been made crafting customizable front ends and dashboards that simplify presentation to the end user. However, one nettlesome problem that has only recently begun to be addressed is that of latency.
Slow query response has been the bane of many business intelligence initiatives. If business users have to wait to gain analytical insights to inform their decisions, they may ignore the tool and act on gut instinct alone or pass the querying work along to overwhelmed power users. The latency issue is particularly acute if a carrier seeks to transition to employing true predictive analytic solutions that provide actionable information as opposed to more traditional, backward-looking business intelligence applications.
HARDWARE AND SOFTWARE
When pondering the continuing advancement of computer-enabled calculations, it is hard not to think of Intel co-founder Gordon Moore. In 1965, Moore famously postulated that, due to miniaturization, the number of transistors on an integrated circuit would double every two years. In the four decades since, the computer industry, through some stupendous feats of engineering, has managed to keep pace with Moore's Law, and the latest generation of Intel processors features a billion transistors etched at 32 nanometers (billionths of a meter). The density of hard drives has followed a similar exponential growth curve over the past few decades when plotted against cost per unit.
The price of another key computer component, RAM (random access memory), has also fallen precipitously over time. While 1 GB of DDR RAM cost roughly $150 five years ago, today 1 GB of faster, more energy-efficient DDR3 server memory can be had for as little as $30.
Dovetailing nicely with the availability of faster, cheaper RAM is the increasing prevalence of 64-bit operating systems over the past few years. While 32-bit operating systems were largely limited to addressing 4 GB of memory, 64-bit systems can address far more: the 64-bit Windows 7 Enterprise edition can use 192 GB of memory, while the Windows Server 2008 R2 Datacenter edition supports an astounding 2 TB.
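The 4 GB ceiling follows directly from the width of the pointer: a 32-bit address can distinguish only 2^32 bytes. A quick back-of-the-envelope check (the 16-exabyte figure is the theoretical 64-bit limit; operating systems cap usable memory well below it, as the Windows editions above illustrate):

```python
# A 32-bit pointer can address 2**32 distinct bytes.
bytes_32 = 2 ** 32
print(bytes_32 / 2 ** 30)  # 4.0 (GB)

# A 64-bit pointer can in principle address 2**64 bytes, i.e. 16 exabytes;
# real operating systems expose far less than this theoretical maximum.
bytes_64 = 2 ** 64
print(bytes_64 / 2 ** 60)  # 16.0 (EB)
```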
IN-MEMORY ANALYTICS
Although there is historically a lag between the availability of new hardware and software and its widespread use in data centers, the sum of these shifts is beginning to impact the range of options for insurers looking to leverage analytics.
One of the most promising manifestations of this is the rise of "in-memory" analytic solutions. In-memory analytics leverages the power of very large memory caches and multi-core architectures. Deployed in a purpose-built grid, it can perform analytic computations such as stochastic modeling far faster by eliminating the need to access data on slower, mechanical hard drives. By way of comparison, hard drive latency is measured in milliseconds while RAM latency is measured in nanoseconds, a difference in speed of six orders of magnitude (1 millisecond equals 1,000,000 nanoseconds). When considering large data sets, the difference between milliseconds and nanoseconds is no trifling distinction.
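That six-orders-of-magnitude gap compounds quickly over a large data set. As an illustrative sketch (the latency figures below are round, assumed numbers for the purpose of the arithmetic, not benchmarks of any particular hardware):

```python
# Assumed, illustrative access latencies:
DISK_SEEK_S = 5e-3    # ~5 ms per random hard-drive seek
RAM_ACCESS_S = 50e-9  # ~50 ns per random RAM access

accesses = 1_000_000  # one million random lookups over a large data set

disk_time = accesses * DISK_SEEK_S   # 5,000 seconds, roughly 83 minutes
ram_time = accesses * RAM_ACCESS_S   # 0.05 seconds

print(f"disk: {disk_time:,.0f} s, RAM: {ram_time:.2f} s")
print(f"speedup: {disk_time / ram_time:,.0f}x")  # 100,000x
```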
With an in-memory platform, a carrier can pull massive amounts of data from a data warehouse into memory, perform analytics there and keep the results in memory for other related computations, says David Wallace, Global Financial Services Marketing Manager at Cary, N.C.-based SAS Institute Inc. "We see great promise for this technology because it enables a new class of applications and a new way of providing answers that heretofore were not easy to address," Wallace says.
One such area where fast answers are at a premium is capital management. In the wake of the economic crisis, with market volatility and threats such as a sudden contraction of liquidity now omnipresent, calculations once performed quarterly or monthly may now be required to run daily. Indeed, risk managers want the ability to instantly evaluate the liquidity of the assets in a given portfolio on a daily or even intraday basis. They also want to model tail risk better and determine value at risk (VaR) by accounting for myriad fluctuations in exchange rates, interest rates, cash flows and counterparty exposures. "Determining VaR is a very iterative process," Wallace says. "Once you calculate VaR, you need to run different scenarios that stress that portfolio, and stress testing is easier because everything is still in memory. The end result is that you have an opportunity to manage your risk exposure much better, with a finer grain, and make fact-based decisions quicker."
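The iterative workflow Wallace describes can be sketched in miniature: simulate a large set of scenario losses, read VaR off as a percentile, then, because the scenarios are still in memory, re-run the stress test as just another pass over the same data. This is a toy Monte Carlo illustration, not the SAS implementation; the normal loss model, the seed and the doubled-volatility stress are all invented for the example:

```python
import random

random.seed(42)

def simulate_losses(n_scenarios, mean, stdev):
    """Draw hypothetical one-day portfolio losses (normal model, for illustration only)."""
    return [random.gauss(mean, stdev) for _ in range(n_scenarios)]

def value_at_risk(losses, confidence=0.99):
    """VaR at the given confidence: the loss exceeded in only (1 - confidence) of scenarios."""
    ranked = sorted(losses)
    return ranked[int(confidence * len(ranked))]

# Base run: 100,000 simulated market states held in memory.
losses = simulate_losses(100_000, mean=0.0, stdev=1.0)
print(f"99% VaR: {value_at_risk(losses):.2f}")

# Stress test: with the scenarios still in memory, re-pricing under a shock
# (here, volatility doubled) is a single cheap pass over the data.
stressed = [loss * 2.0 for loss in losses]
print(f"99% VaR (stressed): {value_at_risk(stressed):.2f}")
```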
To quantify the speed advantage engendered by the in-memory concept, Wallace says SAS ran a portfolio of 45,000 different financial instruments, including bonds and foreign exchange contracts, against 100,000 market states and two time horizons, some 8.8 billion individual calculations in all. "Initially it took 18 hours to run those calculations on servers," he says. "We were able to do that same job in-memory in 3 minutes."
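The figures Wallace cites imply a speedup of two and a half orders of magnitude, which a quick calculation confirms:

```python
calculations = 8.8e9     # individual calculations quoted above
server_run_s = 18 * 3600 # 18 hours on conventional servers
memory_run_s = 3 * 60    # 3 minutes in memory

print(f"speedup: {server_run_s / memory_run_s:.0f}x")  # 360x
print(f"implied in-memory rate: {calculations / memory_run_s:.2e} calculations/s")
```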
Rodney Nelsestuen, senior research director at Needham, Mass.-based consultancy TowerGroup, says in-memory technology works best where decisions are time-sensitive, such as those involving a customer interaction. "If you look at fraud, you can run better analytics on a real-time basis with more data," he says. "This could help sniff out false positives."
THE TECHNICAL PREREQUISITES
Nelsestuen cautions that not every analytic calculation is going to benefit from being done in memory. "I don't think that everybody is going to move everything to in-memory analytics," he says. "There's still a need to aggregate data and it may be much more than you can carry in memory."
To be sure, despite receding hardware prices, the entry bar for in-memory analytics bears some consideration. "You have to upgrade your fundamental technologies because you need enough RAM to do this," Nelsestuen says.
Wallace agrees that one way to leverage in-memory analytics is with a dedicated grid computing environment built specifically for the task. He views the technology as complementary to in-database and traditional grid-based analytic approaches many insurers are currently deploying.
Yet, considering the fierce competition for IT investment dollars, securing the funding to deploy an in-memory platform may be a challenge. "Going to a CFO and defining the value proposition is still a tall order for IT professionals," Nelsestuen says. "They need to put this in terms that senior management can understand and buy into."
Accordingly, Nelsestuen says buy-in from the business is critical. Wallace adds that feedback from customers who wanted answers quicker was a significant factor in the development of the in-memory platform at SAS.
Nelsestuen notes that after the prolonged belt-tightening in IT budgets, the realities of the competitive environment may spur many to invest in the infrastructure necessary to employ in-memory technologies. "People haven't looked at hardware as strategically in the past few years as they probably should have," he says.
THE MATH
While in-memory analytics may well address the latency issue, it also lays bare another problem: the accuracy and quality of the analysis performed. The performance enhancements enabled by in-memory technologies are inconsequential if the underlying analytics, and the mathematical equations that power them, are not robust enough.
Few understand the importance of pure mathematics to business analytics better than Myron Scholes. In the early seventies, Scholes helped create the Black-Scholes equation, the fundamental conceptual framework for one of the thorniest problems in financial services: determining the value of derivatives. In 1997, Scholes was awarded the Nobel Memorial Prize in Economic Sciences for devising the model.
Scholes, who holds the position of Frank E. Buck Professor of Finance Emeritus at Stanford University, says computer scientists and mathematicians need to work together to create efficient solutions to data-intensive problems. Scholes cautions against a "more is better" approach to problem solving. "The biggest quest now is to have efficient mathematical algorithms to be able to solve computationally very dense problems," says Scholes. "Trying to find brute force solutions is very time-consuming, even with computers."
He says the challenge surrounding analytics is akin to Internet search. While more servers may give you more answers faster, they don't necessarily ensure that the answers are more applicable to the question being asked. "Internet search is still brutish; it uses pseudo-algorithms," he says. "Mathematically, it's still not the most efficient solution. With better mathematics you won't have to keep increasing the number of servers or the speed of the computer in order to find the answer."
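Scholes's point, that a better algorithm beats a faster machine, has a textbook illustration: searching a sorted list. A brute-force scan touches every element, while binary search halves the candidates at each step, and no amount of added hardware closes that asymptotic gap. A minimal sketch:

```python
def linear_search(items, target):
    """Brute force: examine every element; cost grows linearly with the data."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):
    """Halve the sorted search space each step; cost grows with log2 of the data."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(1_000_000))
# Both find the same answer, but binary search needs about 20 comparisons
# where the linear scan may need up to a million.
print(linear_search(data, 765_432), binary_search(data, 765_432))
```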
Scholes says better algorithms will only come from better mathematicians, noting that nurturing that human capital is as important as utilizing faster computers.
"We're realizing that the problems we now have to solve are bigger than the computers can now handle," he says. "So, let's take a different approach to solving problems as opposed to letting the computers do all the work. There's a need for pure mathematicians to build and understand mathematical models. Also, operations research people, who think about how to structure answers to problems in efficient ways, are the wave of the future. Our ability to think up problems far exceeds Moore's Law."
THE CHALLENGE
In addition to academia, insurers also need to reassess the role of human insight in analytics. The confusion surrounding the role of analytic models in risk management regimes of banks and insurers during the economic crisis may be sufficient impetus for introspection.
"What the financial crisis revealed was that there was a lack of imagination about the range of probabilities included in the analysis," says Donald Light, senior analyst at Boston-based Celent. "It's not a problem with tools, it's a problem of how people use them for risk management."
Indeed, some argue that a cultural change, in which business users across the enterprise acquire a visceral understanding of the data, is needed for insurers to fully exploit analytics. John Lucker, principal at New York-based Deloitte Consulting LLC, says there needs to be a defined human resources process around hiring the correct mixture of people. In addition to people adept with the mathematical underpinnings of analytics, Lucker says, carriers need people skilled at technology integration, as well as change management experts who can ease analytic processes into the existing culture of the company. "One thing that is widely lacking is a business analytics strategy," he says. "Typically business analytics is used as a tool, but I think companies need an end-to-end business strategy. There's a huge gap between buying tools and executing on a business strategy."
SIDEBAR
Health Care Reform and Analytics
Much as technology is shifting the speed of analytics adoption, health care reform (HCR) regulation may also hasten it.
"HCR has created board-level discussions about projects that were once stalled," says Shawn Jenkins, CEO of Charleston, S.C.-based Benefitfocus, noting that in August, the company acquired Benefit Informatics, Inc., which provides data analytics technology to health plans.
Jenkins says the prospect of health insurers selling policies over government-sponsored exchanges has increased interest in analytics. "When you're online and it's your logo against somebody else's, and you're trying to win customers by guiding them through your Web site, that's a different game than most insurers are used to playing," he says.
Indeed, while insurers have long leveraged analytics to optimize internal functions, increasingly they are using it in customer-facing applications. For example, a carrier could employ analytics to afford consumers a more interactive way to view data from differing policies online. Insurers could learn much from industries such as retail, which employ analytics gleaned from individual customer preferences to determine what the customer sees onscreen. "Insurers are now realizing that they have a wildly rich set of data," says Jenkins. "Compared to your health plan, WalMart doesn't know anything about you. Once you get basic technology in place to warehouse and analyze data, you've created quite a sandbox."