Much like youth, good looks or money, when it comes to raw computing power, too much is never enough.

This is especially true for insurance actuaries, who are charged with some of the most horsepower-hungry jobs in the enterprise: pricing, asset and liability analysis, and estimating reserve requirements.

For the latter, actuaries use stochastic models to estimate the probability distributions of potential outcomes of a given event, say, the chance a life insurance claim will be paid in a certain quarter. Hence the prevalence of multi-day and multi-week modeling runs. Yet these runs often bump against time constraints. An actuary with a 30-day window to develop reserve calculations may be in trouble if an application crashes, or if he makes a mistake in the model. The small window afforded to re-run the calculation may prove insufficient, especially on a shared system.

Even a seemingly simple simulation can require a surprisingly large number of calculations. For example, if one wanted to calculate the reserves for 30,000 life policies over 30 years, on a quarterly basis using 1,000 scenarios, the total number of outcomes would be 3.6 billion. Done sequentially, allowing one second per calculation, such a problem would take more than 114 years to complete.
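The arithmetic behind those figures can be checked in a few lines (a sketch; the one-second-per-calculation rate is simply the article's illustrative assumption):

```python
# Back-of-the-envelope check of the reserve-calculation workload.
policies = 30_000
years = 30
quarters_per_year = 4
scenarios = 1_000

outcomes = policies * years * quarters_per_year * scenarios
print(f"total outcomes: {outcomes:,}")  # 3,600,000,000

seconds_per_calc = 1.0                  # illustrative rate from the article
seconds_per_year = 365 * 24 * 3600
sequential_years = outcomes * seconds_per_calc / seconds_per_year
print(f"sequential runtime: {sequential_years:.0f} years")  # ~114 years
```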

THE LIMITS

Surprisingly, given its computational needs, actuarial software has been more at home on the desktop than in the data center. Even when run on linked high-end desktops and workstations, these solutions often struggle to keep up with modeling demands. That left actuaries with two unpalatable choices: cut the amount of data by running a subset of the policies originally intended, or buy more expensive hardware.

Hoping to avoid such a choice, insurers are increasingly investigating distributed and grid computing options as a way to give their actuaries access to greater computing capacity, and to leverage existing processing power in their data centers.

"A lot of the capabilities the actuaries wanted exceeded the abilities of what we had on the desktop," says Chuck Mahon, manager of server management at Cincinnati-based Western and Southern Financial Group. Rather than continuing to add workstations, the company began to look for ways to leverage their data center for actuarial calculations. "Traditionally, actuaries didn't touch the data center much," he says.

With an ever-increasing number of computational scenarios related to variable annuities prompting it to upgrade, Western and Southern began to investigate the Windows Compute Cluster Server platform from Redmond, Wash.-based Microsoft Corp., which would enable it to perform actuarial runs on servers. A proof-of-concept team consisting of infrastructure staff, application programmers and actuaries from Western and Southern, plus representatives from the company's actuarial software provider, Seattle-based Milliman Inc., convened at a Microsoft Technology Center in Chicago for a simulation.

Mahon says the simulations in the Microsoft lab were "a home-run experience" that reduced the risk of implementation back in Cincinnati. "We learned how to configure it, and how to rapidly scale it out and deploy it," he says, noting the cluster server platform, since renamed Windows HPC (high-performance computing) Server, provisions bare-metal hardware very quickly. "Basically, we learned how to operationalize it."

Now with the product operational, Mahon says jobs that once took 40 to 60 hours on a group of actuaries' computers are done in 30 minutes. Moreover, he likes that the product follows a utility computing model, enabling actuaries to easily submit jobs to it.

This last feature is no accident, says Jeff Wierer, group manager for Windows HPC Server at Microsoft. When the first HPC product debuted in 2006, it processed jobs in batch mode. Feedback from customers in financial services influenced the development of later versions. As a result, the renamed product is more service-oriented, and sports a Web services interface. "In a batch-oriented world, an actuary would submit a job that would run in a black box, and when it was done, he would get the answer back," he says, noting the new platform's ability to see and manipulate data while runs are going on. "Now, you can have a spreadsheet open and see results as they come back."

THE PROMISE OF PARALLELISM

Many of the problems in insurance amount to millions of iterations of a similar model, each with different data. Wierer says the merit of the HPC platform is that it takes jobs that were normally done sequentially, breaks them up, and enables them to be worked on simultaneously across many servers. This is an example of parallel computing. In theory, a problem that takes 10 days to run on a single machine or processor can be broken into component parts and run on 10 machines in a tenth of the time.
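The pattern Wierer describes — the same model run many times over different inputs, split across workers — can be sketched in a few lines of Python. This is a toy illustration, not any vendor's engine; the valuation function and its parameters are invented for the example:

```python
# Sketch of embarrassingly parallel scenario valuation: the scenario
# list is partitioned across worker processes, each running the same model.
import random
from multiprocessing import Pool

def value_scenario(seed: int) -> float:
    """Toy stand-in for one stochastic reserve calculation."""
    rng = random.Random(seed)
    # e.g., a discounted sum of simulated quarterly cash flows
    return sum(rng.gauss(100.0, 15.0) / (1.02 ** q) for q in range(120))

def run_parallel(n_scenarios: int, workers: int = 4) -> float:
    with Pool(workers) as pool:
        results = pool.map(value_scenario, range(n_scenarios))
    return sum(results) / len(results)   # mean reserve estimate

if __name__ == "__main__":
    print(f"mean reserve estimate: {run_parallel(1_000):.2f}")
```

Because each scenario is independent, `Pool.map` here plays the role the HPC job scheduler plays at data-center scale: it hands chunks of the scenario list to whichever worker is free.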

"To break up a problem, an insurer can use HPC Server as resource manager or job scheduler to orchestrate the distribution of that problem across the servers," he says.

While the history of parallel processing, like much else in the industry, harks back to the mainframe, it has drawn widespread attention only in recent years. Wierer says the concept gained more footing within Microsoft in 2005, when company founder Bill Gates addressed a supercomputing conference and laid out a road map for the company to develop parallel computing software.

Oddly, the renewed interest in parallel computing emanates from decisions made by chipmakers. By the middle of the decade, both major suppliers of microprocessors, Advanced Micro Devices and Intel Corp., after years of continuously raising the clock speed on their processors, switched to dual-core designs.

Thus, instead of working on a faster horse, microprocessors would henceforth be a team of horses. While this switch helped the chipmakers avoid issues of heat dissipation, it presented software makers with a problem: how to utilize those extra cores. The issue is so vexing that two years ago, Microsoft and Intel funded the creation of two Universal Parallel Computing Research Centers (UPCRC), one at UC Berkeley and another at the University of Illinois at Urbana-Champaign.

One primary reason software that utilizes parallel processing has lagged behind the hardware is that not all problems lend themselves neatly to being solved in parallel. As Wernher von Braun famously observed, while one woman can have a baby in nine months, nine women cannot produce a baby in one.

Many of the prominent actuarial solutions available today started out as desktop products with limited parallel capability. Thus, vendors of actuarial software had to recode their engines in order to process in parallel.

Wierer notes that actuarial products from Milliman, Towers Perrin and SunGard all now work well with HPC. Even one of the more venerable of actuarial tools, Microsoft Excel, is now optimized to use multiple cores.

"The problem with using desktop computing for parallelization is that you start hitting bottlenecks," says Marc Fakkel, global head of the actuarial business practice group for SunGard. "So we decided to develop a proper enterprise, client-server solution, and re-architected the calculation engine to allow it to be distributed efficiently and make it scalable."

SCALABILITY

One of the most promising aspects of parallel computing is that it offers linear scalability: the more cores or servers attacking a problem, the faster it is solved. "You can scale the heck out of it," Mahon says.
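The linearity claim is easy to see in the ideal case; a short calculation with the standard Amdahl's-law formula (a sketch, with an illustrative 5% serial fraction) also shows why scalability stays linear only as long as virtually none of the job runs sequentially:

```python
# Amdahl's law: speedup S(n) = 1 / (serial + (1 - serial) / n),
# where `serial` is the fraction of the job that cannot be parallelized.
def speedup(n_workers: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

for n in (1, 10, 100):
    # perfectly parallel job vs. one with a 5% serial portion
    print(n, round(speedup(n, 0.0), 1), round(speedup(n, 0.05), 1))
```

With no serial portion, 10 machines give a 10x speedup and 100 give 100x; with even a 5% serial portion, 100 machines deliver well under 20x. Actuarial scenario runs scale so well precisely because the per-scenario work dominates.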

Another factor in favor of parallel computing is price. For a very modest outlay of capital, distributed solutions can run on commodity data center hardware. "The price point per core has come down substantially, and that has had a major impact from a budgeting perspective," Fakkel says.

Mahon agrees, adding that taking a distributed, server-based approach is low risk, high reward. "Hardware is cheap now," he says. "Dell, HP and IBM are producing great hardware at commodity prices. With these high-density blades, there are often eight processor cores: an amazing amount of processing capability. For actuarial problems, that's what you want."

Moreover, a utility-model platform such as Windows HPC Server can be used by others within the enterprise when the actuaries are not using it. A thin-client architecture, in which applications and data are centralized on servers and accessible from any machine on the network, is also possible. "Leveraging this for the rest of the enterprise is pretty straightforward," Mahon says.

Eric Webster, VP of marketing for Bloomington, Ill.-based State Farm Insurance, says the company uses such a client-server based architecture for its deployment of business intelligence software.

TOO MUCH

While the scalability of parallel systems may be linear, the question remains whether increased processing power leads directly to better operations. The more computing power becomes available, the more actuaries tend to do, SunGard's Fakkel says, noting that the trend toward ever-more-complex stochastic models may well nullify the increases in speed afforded by parallelism.

"Because we're moving to the multicore and distributed processing route, what we're seeing now is that companies are trying to be more detailed in their modeling," says Tom Hettinger, managing director of actuarial consultancy and software provider EMB America LLC, San Diego, noting that the company's newer offering makes use of distributed processing.

Yet this doesn't mean Hettinger counsels a brute-force approach. Understanding what's driving the results the models spit out is paramount, he says. "Details can only get you so far," he says. "We're still making guesses, and are probably better served going a simpler route."

Fakkel, too, says that in the end, a model is just a model and dependent on human analysis. "If there's one thing in life that is guaranteed, it's that your model will never actually represent the future."

(c) 2009 Insurance Networking News and SourceMedia, Inc. All Rights Reserved.
