The Forgotten Disaster

While hurricanes are considerate enough to announce their intentions prior to arrival, earthquakes arrive practically unannounced. Although seismologists can provide general forecasts of the probability of a seismic event along a given fault line, a system accurate enough to predict a particular event in time to allow an evacuation of an at-risk area remains elusive. "Looking at earthquakes is important because it is the forgotten hazard right now," says Andy Castaldi, head of the Catastrophe and Perils Division at the Americas Hub for Swiss Re Americas Holdings. "What happens when we have a Katrina-like event for earthquakes?"

Fortunately, America's cities haven't endured a major earthquake since 1994, when the Northridge Earthquake struck California. Even though the quake was a relatively modest 6.7 in magnitude, it was strong enough to collapse freeways and was costly to insurers. According to inflation-adjusted figures from the Insurance Information Institute, the Northridge quake caused $20 billion in damage and $12.5 billion in insured losses, ranking behind only Hurricanes Katrina and Andrew among the most costly natural disasters of the past 30 years.

In addition to the losses, the Northridge quake had another major impact on insurers: many stopped writing coverage in the state.

The reluctance of carriers to expose themselves to earthquake risk is not limited to California. In 2006, Northbrook, Ill.-based Allstate Corp. announced that it was dropping most earthquake coverage nationwide. One reason insurers may shy away from the risk is that the error bar is so large. The realm of uncertainty around the loss potential is huge, because the power of earthquakes varies so widely. For example, while a magnitude 7 quake releases 31 times the energy of a magnitude 6, a magnitude 8 quake releases roughly 1,000 times the energy of a magnitude 6.
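That spread follows from the standard seismological rule of thumb that radiated energy grows by a factor of about 10^1.5, roughly 31.6, per whole unit of magnitude. A minimal sketch of the arithmetic, assuming only that rule (the function name is ours, not taken from any vendor model):

    # Rule of thumb: radiated seismic energy scales as roughly 10^(1.5 * magnitude).
    def energy_ratio(m_low, m_high):
        """Factor by which a magnitude m_high quake out-releases a magnitude m_low quake."""
        return 10 ** (1.5 * (m_high - m_low))

    print(round(energy_ratio(6.0, 7.0)))  # ~32 (the article rounds to 31)
    print(round(energy_ratio(6.0, 8.0)))  # ~1,000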

Given the wide range of damage possibilities, precise modeling is critical for insurers: an overly conservative estimate of future catastrophe losses may price them out of the market, while an overly optimistic assessment can leave a company vulnerable to big losses should a major event occur.

"Even with Hurricane Katrina, the number of deaths was minor compared to what you'd see if you had a major earthquake in a heavily populated area," notes Alice Gannon, a senior consultant with San Diego-based EMB America LLC.

IN OR OUT

Carriers can build their own models or, more commonly, rely on models developed by firms such as Newark, Calif.-based Risk Management Solutions Inc. (RMS) or Boston-based AIR Worldwide Corp., which are often run by reinsurance brokers.

Zurich-based Swiss Re does the former, having built its own proprietary modeling tool. "Our staff includes scientists and engineers," Castaldi says. "We do some new research, but a lot of our work involves adapting existing research. We take science and transform it into applications."

Castaldi says having a homegrown modeling tool has its advantages. "It's more transparent to us because we built our own tool so we understand why it goes this way versus that way," he says. "We can open up the tool and understand why it works."

Since carriers that use outside modelers are not privy to a model's inner workings, they need to be vigilant that the numbers it returns are plausible.

"At a minimum, we try to look at the data the same time every year, so we have an idea what the model is likely to say for our 100 and 250-year PMLs based on our known annual rate of exposure growth," says Tom Holtshouse, VP of product management for Bucyrus, Ohio-based Ohio Mutual Insurance Group. "Once in a while, we request our reinsurers run both the RMS and AIR models just to check one against the other. Historically, for us, they've been reasonably close."

So how much trust can a carrier have in its models?

"From our point of view the CAT Modeling programs that predict the Probable Maximum Loss (PML) per event and as an annual aggregate, are somewhat of a 'black box,'" Holtshouse says. "After Katrina, everybody was a bit 'gun shy' to be sure that the models were actually giving the right PMLs."

Yet, paradoxically, the bigger the disaster, the easier it is to model accurately, says Jim Kennedy, SVP for CAT modeling for London-based Benfield Group. "Hurricanes and earthquakes are the perils that modeling firms have the most comfort in because they generally encompass a wide area, so the law of large numbers comes into play," Kennedy says. "If you have to predict a tornado, and it's down to one house, your chances of being right are very small."

WHERE, WHEN and WHAT

The existence of well-known, active fault lines, such as the San Andreas in California, helps modelers pinpoint the locations most at risk for earthquake damage. Other, less active faults, however, present a greater degree of uncertainty. One such fault is the New Madrid fault, which underlies parts of Illinois, Missouri, Arkansas, Kentucky, Tennessee and Mississippi.

"We're always paying attention to a California earthquake, but a New Madrid earthquake is the more frightening scenario from an economic loss level," Castaldi says. "We don't have the information underlying the New Madrid hazard, so it's a very uncertain component of the models."

While knowledge of fault lines enables modelers to get a somewhat accurate idea of where an earthquake will strike, the when remains elusive. "When the earthquake happens is important," says Kennedy. "The difference in loss potential between 3 a.m. on Sunday and 3 p.m. on Wednesday is huge."

One area where definite strides are being made is the "what." When it comes to damage, what an earthquake strikes is as important as where it strikes. The damage to a steel-frame building will differ greatly from the damage to one made of, say, unreinforced masonry. Newer models are giving greater weight not only to how a building is constructed, but to how it is occupied. Castaldi says that while current earthquake models excel at estimating structural damage, often half the cost of repairing a damaged building comes from non-structural elements such as heating, cooling and plumbing systems. Future models will do a better job of correlating structural and non-structural damage, such as calculating the probability that a quake will damage a building's sprinkler system, causing water damage, or the probability of fire following a seismic event, he says.
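A toy illustration of that point, with invented damage ratios (not figures from Swiss Re or any modeling vendor), shows how construction class and the structural versus non-structural split drive an estimated repair cost:

    # Invented mean damage ratios (fraction of building value at a given shaking level),
    # split into structural and non-structural components. Values are illustrative only.
    MEAN_DAMAGE_RATIO = {
        "steel_frame":          {"structural": 0.05, "non_structural": 0.06},
        "unreinforced_masonry": {"structural": 0.30, "non_structural": 0.25},
    }

    def expected_repair_cost(building_value, construction_class):
        parts = MEAN_DAMAGE_RATIO[construction_class]
        return building_value * (parts["structural"] + parts["non_structural"])

    print(expected_repair_cost(1_000_000, "steel_frame"))           # about $110,000
    print(expected_repair_cost(1_000_000, "unreinforced_masonry"))  # about $550,000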

Despite these concerns, Castaldi says the models are very good. "The insurance industry is pretty sharp at understanding our exposures and hazards," he says. "I think that we're doing a darn good job. I just think that there might be some things we want to pay attention to going forward."

MORE, BETTER DATA

One thing hampering even the best-designed models, be they internal or external, is a lack of data. Unlike hurricanes, which happen annually, there are relatively few major earthquakes for modelers to use to validate their models. "This is an empirical science," notes Imelda Fox, principal and SVP of Philadelphia-based Towers Perrin Reinsurance.

"Whenever there's a major event, we'll use it to learn from and calibrate our models," Castaldi adds. He says a seismic event, or even an absence of events, can be cause to re-evaluate a model.

Modelers also can avail themselves of studies performed by the government and academia. Last year the United States Geological Survey completed a major revision of its California earthquake catalog, which will be reflected in models this year, Castaldi notes. Kennedy adds that Benfield also reviews earthquakes that occur in Turkey and Japan.

Modelers also can now tap deep wells of database information. Until five or 10 years ago, Kennedy says, companies had a hard time getting access to all the pertinent information about the properties they covered, even such essentials as the year constructed, building height and soil type. Now, in the age of the Internet and geocoding, such information is much more readily available. "A lot more modifiers are used that can help refine the damage factor that the model uses," Kennedy says.

In addition to the primary modifiers, secondary modifiers, such as the presence of unreinforced walls, are increasingly being used to make the models even more sophisticated. "The data insurers capture has become so much better over the last few years, it's given us this really rich claims data set to work with, to try to understand what is driving the losses for an individual portfolio, relative between different construction classes," says Don Windeler, director of model management at RMS.
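A hedged sketch of the idea: a primary characteristic such as construction class sets a base damage factor, and secondary modifiers scale it up or down. The modifier names and values below are invented for illustration and are not actual RMS or AIR inputs:

    # Illustrative only: construction class sets a base damage factor; secondary
    # modifiers nudge it up or down. All numbers below are invented.
    BASE_DAMAGE_FACTOR = {"wood_frame": 0.08, "unreinforced_masonry": 0.35}

    SECONDARY_MODIFIERS = {
        "cripple_wall_braced": 0.85,   # seismic retrofit lowers expected damage
        "soft_story": 1.30,            # open ground floor raises it
        "unreinforced_chimney": 1.10,
    }

    def adjusted_damage_factor(construction_class, features):
        factor = BASE_DAMAGE_FACTOR[construction_class]
        for feature in features:
            factor *= SECONDARY_MODIFIERS.get(feature, 1.0)
        return factor

    print(adjusted_damage_factor("wood_frame", ["cripple_wall_braced"]))  # about 0.068
    print(adjusted_damage_factor("wood_frame", ["soft_story", "unreinforced_chimney"]))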

Holtshouse concurs, noting that over the past two to three years the company's estimated PMLs for Ohio have fallen or remained roughly constant, despite growth in total insured property values, as it has fed more complete 'street-level' address data into the models. "The quality of the data we provide to the models has improved dramatically over the years," he says.

Another factor making for better models is that increased computing power has allowed researchers to try more sophisticated simulations. Models can simulate ground motions from earthquakes and gauge how different structure and soil types will perform. "There's continuing research going on in both the seismological community and the engineering community to understand how structures perform under strong shaking," Windeler says, adding that insurers are using the models in much more sophisticated ways. "They're slicing and dicing the outputs."
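As a very rough sketch of what such a simulation computes, the toy function below estimates peak ground acceleration falling off with distance from the rupture and amplified on soft soil. The functional form and coefficients are invented; real models use published ground-motion prediction equations:

    # Toy ground-motion sketch; the equation and coefficients are invented for illustration.
    import math

    def toy_pga(magnitude, distance_km, soft_soil):
        """Very rough peak ground acceleration (in g) at a site."""
        pga = math.exp(0.9 * magnitude - 1.5 * math.log(distance_km + 10.0) - 2.0)
        return pga * (1.4 if soft_soil else 1.0)  # crude soil amplification

    print(round(toy_pga(7.0, 20.0, soft_soil=True), 3))   # stronger shaking on soft soil
    print(round(toy_pga(7.0, 20.0, soft_soil=False), 3))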

TIES WITH UNDERWRITING

Thus, the goal for carriers is to have sophisticated models firmly embedded in their underwriting process, allowing them wide views of their exposures.

Kennedy says that although an individual building may look fine in a model on its own, it may become a risky proposition when aggregated with other buildings in the neighborhood. He adds that Benfield has a risk-mapping product, called ExposureView, which allows carriers to look at the accumulation of exposures on a zonal basis.
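A minimal sketch of that kind of accumulation check, rolling up total insured value by geocoded zone (ExposureView itself is proprietary; the zones and values below are invented):

    # Illustrative only: aggregate insured values by zone to spot accumulations.
    from collections import defaultdict

    policies = [
        {"zone": "94103", "total_insured_value": 2_500_000},
        {"zone": "94103", "total_insured_value": 4_100_000},
        {"zone": "63101", "total_insured_value": 1_200_000},
    ]

    accumulation = defaultdict(float)
    for policy in policies:
        accumulation[policy["zone"]] += policy["total_insured_value"]

    for zone, tiv in sorted(accumulation.items(), key=lambda kv: -kv[1]):
        print(f"Zone {zone}: ${tiv:,.0f} accumulated insured value")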

Indeed, it's important that insurers realize models are part of the underwriting process and not a substitute for it.

"In 2004 and 2005, multiple events happened and they were bigger than people anticipated," Kennedy says. "Even though there was only a small probability to it happening, it did happen. So people started to not rely on the models 100% anymore. They are now also looking at accumulation of exposures and going back to old fashioned underwriting. They're not putting all their eggs in the modeling basket."

Holtshouse agrees that models need to be integrated with solid underwriting. "Overall, we've been impressed with the ever-higher level of sophistication of the models," he says. "But it's important for us and the industry to remember that it's just one tool, just an estimate, and not the 'gospel'."

(c) 2008 Insurance Networking News and SourceMedia, Inc. All Rights Reserved.
