A New Look At Operational Risk

Who knows what the future holds? While the U.S. Department of Homeland Security is actively investigating terrorist communications and activity to prevent possible attacks, insurers are beginning to understand the necessity of taking a new, more proactive approach to assessing their own risks of loss related to people, processes, technologies or external events. These "operational risks," as defined by the Bank for International Settlements (BIS) in its proposed Basel II accord, are exemplified by events and failures such as Enron and WorldCom, the power outage in the Northeast last summer, the continual stream of computer viruses infesting the Internet, and, of course, the September 11 terrorist attacks.

But operational risks also include everyday "failures," such as employee errors, system downtime and the loss of key talent. All in all, 90% of 435 financial services firms recently surveyed by SAS, a business intelligence provider with U.S. offices in Cary, N.C., say they lose $10 million per year through poor operational risk management, and 35% lose as much as $120 million per year.

And, while Basel II does not directly affect the U.S. insurance industry, the proposed international regulation, which will require banks (and European insurers) to measure their operational risks and set aside capital to cover possible losses, will have a trickle-down effect on North American carriers, industry experts say.

"My prediction is that (the Basel II accord) will affect the insurance industry overall," says Vincent Oliva, head of the financial services vertical at Stamford, Conn.-based Gartner Inc. "An insurance company that thinks it will not be affected by (Basel II) in the future really has its head in the sand."

Initially, the Basel accord was geared toward credit losses and credit risk management, Oliva notes. "But if you really read the proposed regulation, the BIS is talking about reserves for losses, they're talking about the financial services industry. Tie a bow around it, and you're talking about insurance."

A clear leader

Toronto-based RBC Insurance is a clear leader in the area of operational risk management, industry sources say. As the insurance arm of the Royal Bank of Canada (RBC) Financial Group, RBC Insurance is benefiting from RBC Financial's enterprisewide approach to risk management.

In the fall of 2002, RBC appointed Grant Hardy as executive vice president and chief risk officer of its insurance operations. In this newly created position, Hardy was charged with managing risk across the insurance businesses as outlined in RBC Financial's risk pyramid.

At the top of the risk pyramid is systemic risk, which an organization has little or no control over, Hardy explains. At the next level are competitive, regulatory and legal risks, which the organization has some control over, followed by strategic and reputational risk.

"At the bottom of the pyramid are risks we believe we have the most direct control over," says Hardy. These include credit and market risk, liquidity, insurance, and operational risk.

Prior to Hardy's appointment, no one within RBC's insurance operation was focused specifically on risk management. "The actuaries were doing some of it in the business units," Hardy says. But when RBC Financial implemented a structured operational risk management program in 2001, risk management at RBC Insurance was elevated as well.

"I was put in this job to make sure we consistently measure insurance risk across the different lines of business-and to make sure we have those measurements approved at the financial group level," Hardy explains.

Approximately two years ago, even before Hardy took office, RBC Financial Group began segmenting each of its business platforms, including insurance, into operational risk entities. Each operational risk entity then went through a carefully designed risk and control self-assessment. "RBC conducted a whole work-up of each operational risk entity to determine whether it was high, medium or low risk," says Hardy.

Although risk management is centralized under Hardy's purview, "the process we're using for self-assessments is designed to give the business units control, rather than making this a functional unit responsibility," he says. "We want it to become a part of our culture to coordinate, oversee, and assess risk."

To that end, RBC also is using a technology platform from Amarillo, Texas-based Portiva Corp. to capture operational risk data. That data includes root causes of losses, controls, effectiveness of controls, impact scores, likelihood scores, residual risk scores, and trends and action plans.
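The data Hardy lists maps naturally onto a per-risk assessment record. Below is a minimal sketch of such a record, assuming a common convention in which a residual risk score is derived from impact and likelihood scores discounted by control effectiveness; the field names, scales and formula are illustrative assumptions, not Portiva's actual schema.

```python
from dataclasses import dataclass

@dataclass
class RiskSelfAssessment:
    """One assessed risk for an operational risk entity.

    Scores use a 1-5 scale; control_effectiveness runs 0.0-1.0.
    Field names and the residual-risk formula are illustrative
    assumptions, not Portiva's actual schema.
    """
    entity: str                   # operational risk entity, e.g. a business unit
    risk: str                     # named risk, e.g. "employee processing errors"
    root_cause: str
    controls: str
    control_effectiveness: float  # 0.0 = no mitigation, 1.0 = fully mitigated
    impact: int                   # 1 (negligible) to 5 (severe)
    likelihood: int               # 1 (rare) to 5 (almost certain)

    @property
    def inherent_score(self) -> int:
        return self.impact * self.likelihood

    @property
    def residual_score(self) -> float:
        # Inherent risk discounted by how well the controls work.
        return self.inherent_score * (1.0 - self.control_effectiveness)

rsa = RiskSelfAssessment(
    entity="Life claims", risk="Incorrectly processed policies",
    root_cause="Manual re-keying", controls="Dual review of refunds",
    control_effectiveness=0.6, impact=4, likelihood=3)
print(rsa.residual_score)  # inherent 12 -> residual 4.8
```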

In addition, RBC Insurance is developing its own database to collect loss data associated with employee errors, which was identified as one of the company's top operational risks, according to Hardy. "We're trying to assess the costs of errors such as paying a claim we didn't have to pay or refunding premiums on policies because they were incorrectly processed," he explains.

Good business practice

The collection and analysis of operational loss data is still evolving at RBC. "We're trying to make sure we know what's going on, how we can use this information as an organization, and get risk assessment and control entrenched into people's thinking," says Hardy. "In many respects it just becomes continuous self-improvement. It's good business practice."

Indeed, comprehensive risk management is aimed at institutionalizing operational best practices that produce a profit, whether that profit finds its way to shareholders as dividends or to internal partners, says Gail McGiffin, associate partner at Bermuda-based Accenture. "It's just part of effective management."

Operational risk has been an audit issue for a long time, but the approach is new, notes Deborah Williams, group vice president, corporate banking and capital markets at Financial Insights, an IDC company based in Framingham, Mass.

"The primary difference is that audit is a reactive function, while operational risk management is a proactive function," she says. "Operational risk management is meant to be something that's part of the everyday business process. It's meant to be owned by the business line. It's meant to be consistent. And management is meant to use operational risk information to better manage their business."

Not surprisingly, empirical evidence suggests a correlation between corporate risk quality and financial performance. One recent study reveals that companies with high risk quality exhibit lower cash flow volatility.

"These findings provide evidence for what is intuitively understood by corporate executives-that improving risk quality is a key driver of sustained shareholder value," says Deborah Petty, principal at Oxford Metrica, the strategic advisory firm that conducted the study of 438 companies across industries.

Perhaps because of executives' intuitive understanding, along with regulatory and competitive pressures, more than half the financial services companies surveyed by SAS have initiated a coordinated operational risk management program in the past three years.

But 57% of insurance companies have no such program in place, while 75% of retail banks do. "Insurers certainly understand their risks, especially with 9/11 and the collapse of Enron and WorldCom and the stock market volatility," says Ritu Jain, SAS global insurance strategist. "They're all focused on improving their bottom line. And for that purpose, they're looking at operational efficiencies and reducing risk," she says.

However, unlike the banking industry, which is preparing for Basel II, the U.S. insurance industry faces no similar regulatory pressure. "They don't have to keep capital aside for operational risk," she notes.

Still, publicly owned U.S. insurers are facing significant potential penalties associated with Section 404 of the Sarbanes-Oxley Act, which requires executives and auditors to attest to the effectiveness of internal controls over financial reporting (an operational risk issue, to be sure).

"Basel II and Sarbanes-Oxley are closely related," says Financial Insights' Williams. "While Basel II looks at the management of exposures and tracking those exposures, Sarbanes-Oxley Section 404 looks at the controls in place and how effective they are."

Areas of priority

In fact, compliance with Sarbanes-Oxley and operational resilience in the face of loss events are two new areas of priority for IT investment at insurance companies, according to Deborah Smallwood, insurance practice leader at TowerGroup Inc., Needham, Mass.

"During the blackout of 2003, a lot of insurance companies were able to bring their data centers up and bring up their computer systems and business continuity plans," she says. "But they found that their processes and their people were still very vulnerable."

Likewise, Sept. 11 revealed the exposure insurers faced with the accumulation of brokers in the World Trade Center, says Accenture's McGiffin. "Prior to that, the insurance industry didn't view itself as having an accumulation of exposure relative to personnel," she says. "That was a wakeup call."

In fact, the inadequacy or loss of people resources is one of the top three operational risks identified at RBC Insurance, according to Hardy. "We've put together a number of businesses with rather small companies over the past seven years. In the process, you end up with those people being the experts, and you have very few experts," he says. "The loss of people resources at this point in time could hurt us."

A related risk, which RBC Insurance also has classified among its top three, is systems capability risk. "Given that we've done a number of acquisitions, getting to one platform is a concern," says Hardy.

When it comes to technology investments, a number of tools are necessary to measure and mitigate operational risk in a comprehensive way, according to sources. "It's not a nice, neat 'buy-one-and-install-it-and-it-solves-the-problem' kind of technology," says Financial Insights' Williams.

Instead, operational risk management poses a range of IT challenges. In addition to Web-enabled risk self-assessment tools, a key technology is a loss database, which a company uses to collect detailed information about operational failures, according to sources.

For Basel II, any failure in people, process, technology or external event that leads to a $10,000 loss or more becomes part of that loss database, Williams notes.

"So the teller who decided to abscond with the drawer will be in there. The agent who submitted a fraudulent claim and directed the money to his own account would be in there. The 'Oops, I forgot to turn the computer on and the file was never sent' would be in there," she says.

U.S. insurers, however, are struggling with data management and data quality challenges quite apart from the issue of operational risk management.

Most insurance companies are still using legacy systems, although they are beginning to replace them, notes SAS's Jain. "They need to bring in better data management capabilities so the data is not in silos. They need to enable their applications to talk to one another. They need to re-engineer some of their business processes," she says.

Currently, different lines of business in most insurance companies have different methodologies for measuring and pricing risk, Jain adds. "Integrating these systems will enable them to look at the co-dependencies between various risks." With Sept. 11, insurers bore not only property damage losses; life insurance and disability losses were even larger, she says.

Ideally, once insurers collect risk self-assessment data and loss data, they will then use analytical engines to perform predictive analyses-such as worst-case analysis and probability of losses-similar to what is now done for credit and market risk, says Financial Insights' Williams.

"The trouble now is having sufficient data to perform these analyses with any statistical assurance," she says.

Scratching the surface

Another obstacle is data quality, says Gartner's Oliva. "Insurers are struggling with their legacy systems and replacing them is a huge cost," he says.

"They're making use of Web services and middleware to extract data from these systems. But the problem is the data itself is often not clean. It can be corrupt and in various formats. So a huge amount of standardization and cleanup is required." (See "Mapping Technology Exposes Data Entry Errors," left.)

"Insurers are talking to us about how to better manage their operational risk," says Jain. SAS offers operational risk collection and analysis products, as well as a global loss database. "They're asking what do they need to be doing? They're scratching the surface right now. They're trying to educate themselves. In the next five years, we'll see a lot more improvement."

Mapping Technology Exposes Data Entry Errors

Gen Re Intermediaries is not in the business of uncovering operational risk for its clients; the Stamford, Conn.-based subsidiary of Gen Re advises property/casualty insurers on their catastrophe risk. But the company uses mapping technology to visually depict property catastrophe exposure for its clients, and, in the process, it helps them uncover their data entry errors.

"One source of operational risk is data entry where you've captured incorrect policy information-whether it's coverage data, location data, or limits data," says Anthony Sammur, head of catastrophe modeling at Gen Re Intermediaries.

Using a tool developed by Troy, N.Y.-based MapInfo Corp., Gen Re Intermediaries imports policy data from its clients, geo-codes it and produces maps that illustrate their geographic exposure.

"The tool enables our clients to look at their risks on a map-to look at each and every dot-and notice, for instance, 'We don't write risk in that territory or region, so why is this dot showing up there? We'd better go and fix that,'" he says.

"We've seen data sets with an 80% probability of error," says Sammur. All in all, Sammur says his firm sees an average 20% error rate in the client data it receives-although not all those errors are significant or relevant.

Still, "that's a huge source of operational risk,' he notes. "If you're asking us to model your severity potential from natural peril hazards such as earthquakes, tornadoes and hurricanes, and your data is incomplete or inaccurate from a location standpoint, you're not going to have much faith in the results."
