How Tokio Marine HCC builds its underwriting risk capabilities

[Photo: Tokio Marine HCC headquarters in Houston, Texas.]

Digital Insurance spoke to Chris Skarinka, chief operating officer of the professional lines and public risk groups at Tokio Marine HCC. He leads a team of 100 staff members supporting the carrier's property and casualty coverage of public entities such as police and fire departments and water and sewage authorities. Skarinka and his team assess these entities' risks and organize the group's data and risk operations. Recently, Tokio Marine HCC began using an underwriting workbench developed with Unqork.

How does the work of the public risk division support Tokio Marine’s underwriting business?

[Photo: Chris Skarinka, chief operating officer of the professional lines and public risk groups at Tokio Marine HCC.]
We operate as the Public Risk Group of Tokio Marine HCC (TMHCC), a specialty insurer with deep expertise in multiple verticals and classes of business. As a P&C package provider, the Public Risk Group touches almost every single line of coverage. For example, we write EPL [employment practices liability], D&O [directors and officers] for public officials, property, and auto liability insurance. 

Given this breadth of coverage, a lot of the work we do from a technology standpoint can be repurposed within the enterprise as a whole. For instance, if we introduce a new underwriting or risk management tool for employment practices liability, TMHCC can leverage that tool for its separate EPL group and vice versa. There's a symbiotic nature to what we do relative to the rest of TMHCC and Tokio Marine, our parent organization.

While we all focus on a certain class of business or have expertise in certain components, we all receive a submission, produce a quote, and bind a policy. PRIME, the Public Risk Group's new online submission system, developed with Unqork, helps accelerate and automate much of this process. Specifically, we have built a fantastic underwriting workbench, with APIs to third-party data sources and internal platforms along with workflows and business logic, whose functionality can be leveraged by others who may not be focused on public entity coverage. Our colleagues across Tokio Marine can "copy and paste" what we've created and implement these processes into their own workflows with only minor tweaks.
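
PRIME itself is built on Unqork's no-code platform, so the sketch below is not its actual configuration. It is only a rough Python illustration of the kind of API-driven intake, pre-fill and validation workflow described above; every function, data source and field in it is a hypothetical stand-in.

```python
# Hypothetical sketch of an intake-to-quote workflow; none of these names
# or data sources are PRIME's real ones.
from dataclasses import dataclass, field

@dataclass
class Submission:
    entity_name: str
    state: str
    lines: list = field(default_factory=list)  # e.g. ["EPL", "D&O", "property"]

def fetch_public_entity_data(name: str, state: str) -> dict:
    """Stand-in for an API call to a third-party or public data source."""
    # In practice this step would pre-fill the submission from external records.
    return {"population": 45_000, "police_officers": 60, "has_jail": True}

def validate(submission: Submission, profile: dict) -> list:
    """Business logic: flag anything an underwriter should review by hand."""
    flags = []
    if profile["has_jail"] and "law enforcement" not in submission.lines:
        flags.append("Jail exposure present but law enforcement line not requested")
    return flags

# The same fetch/validate steps could be "copied and pasted" by another
# group (say, a standalone EPL team) with only the business rules swapped out.
sub = Submission("Example Water District", "FL", ["EPL", "property"])
profile = fetch_public_entity_data(sub.entity_name, sub.state)
print(validate(sub, profile))
```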

In recent years, have you seen changes to the logic and models used to evaluate these risks?

Certainly, in the public entity space the models are changing as a result of the shift in the risk landscape. In law enforcement liability specifically, public perception in the U.S. has changed markedly over the last three or four years compared with previous decades. 

In 2010, there may not have been a significant amount of attention paid to this line of coverage from an underwriting and modeling perspective. Now, underwriters are digging into this exposure at a deeper level. For example, how many police officers an entity employs, what types of officers they are, and whether a highway runs through the entity (which can mean more stops per officer) are all factors that are considered. There's also a greater focus on jails in today's environment, including analyzing size, crowding, policies and protocols.
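
As a rough illustration of how factors like these might feed a relative exposure score, consider the sketch below; the weights and surcharges are invented for the example and are not TMHCC's rating logic.

```python
# Hypothetical law enforcement liability exposure score; all factor
# weights below are invented for illustration.
def law_enforcement_exposure(officers: int,
                             highway_through_entity: bool,
                             jail_capacity: int,
                             jail_population: int) -> float:
    score = officers * 1.0                       # base exposure scales with force size
    if highway_through_entity:
        score *= 1.15                            # more traffic stops per officer
    if jail_capacity > 0:
        crowding = jail_population / jail_capacity
        score *= 1.0 + max(0.0, crowding - 1.0)  # surcharge only when over capacity
    return score

# 60 officers, a highway through the entity, and a jail at 120% capacity:
print(round(law_enforcement_exposure(60, True, 100, 120), 1))  # -> 82.8
```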

We're starting to see risk management components brought into rating and underwriting more frequently than they used to be in the public entities market. Technology is enabling us to analyze more data and to react more quickly. Especially for public entities, where much of the data is publicly available, there are significant opportunities to leverage it to provide more tailored products.

What elements of cost reduction and revenue generation go into underwriting and technology models?

The biggest focus for us is on revenue generation. The insurance industry traditionally has viewed technology as a cost center, but we need to shift that perspective. Automation, data pre-fill and data validation make it easier to do business with us; that should lead to additional revenue. This technology also helps us get to market faster so that we can introduce innovative new solutions for our insureds and partners. And the ability to ingest more data more quickly makes our models smarter, which can improve profitability. 

I mentioned risk management data earlier. Including it in your underwriting process leads to more optimized quotes. For example, factoring in information about a levee system may indicate that flooding is less likely. From an EPL perspective, certain HR policies and procedures followed by a public entity may indicate lower exposure. Cyber is at the vanguard of this and doing a great job of automating this data ingestion and analysis. Over time, we should see other classes of business pursue a similar strategy.
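
A minimal sketch of that credit idea, with made-up credit sizes rather than any actual rating plan:

```python
# Hypothetical risk management credits applied to a base premium;
# the 5% and 3% figures are invented for illustration.
def apply_risk_credits(base_premium: float,
                       has_levee_system: bool,
                       strong_hr_policies: bool) -> float:
    premium = base_premium
    if has_levee_system:
        premium *= 0.95   # flood credit: a levee makes flooding less likely
    if strong_hr_policies:
        premium *= 0.97   # EPL credit: documented HR policies and procedures
    return premium

print(round(apply_risk_credits(100_000, True, True)))  # -> 92150
```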

What has changed in applying technology to insurance for climate risks and events?

We think about this in two primary ways: before the climate event occurs, and during and after an event. From a before perspective, there is much more atmospheric, climate and ground-truth data available today than even five years ago. Our industry needs to accelerate its efforts to ingest it, analyze it and marry it with the underlying exposure information to turn it into actionable decision making for underwriters. We also have much more data available today on structures, the built environment, and resiliency and mitigation aspects related to climate. To use a public entity example, combining this data could show that a particular police station has lower climate risk, which could lead to lower pricing. 

If we can better track climate risk, we can better predict it. Ten years ago, many in the (re)insurance space referred to hail, flood or severe convective storms as secondary perils. Today, the perspective is different: they are now much closer to the primary category because of the level of losses. The industry needs more focus from the modeling providers on what used to be secondary perils. For example, the amount of pavement and concrete in the built environment across the U.S. is much higher today than 15 years ago, and pavement and concrete do not absorb water as well as natural landscape does. Even if the climate had not changed at all over those 15 years, flooding risk would still be different because of the increase in pavement, concrete and structures. We need to ingest that information into our models in order to predict risk more accurately. 
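
A toy calculation makes the pavement point concrete. The runoff coefficients below are textbook-style approximations (roughly 0.9 for pavement, 0.2 for natural ground), not a calibrated flood model; the point is only that the same storm yields more runoff as impervious cover grows.

```python
# Share of rainfall that becomes surface runoff, as a simple blend of
# assumed runoff coefficients for paved and natural ground.
def runoff_fraction(impervious_share: float) -> float:
    return 0.9 * impervious_share + 0.2 * (1.0 - impervious_share)

# Same storm, 15 years apart, with impervious cover rising from 30% to 45%:
print(round(runoff_fraction(0.30), 3))  # 0.41
print(round(runoff_fraction(0.45), 3))  # 0.515 -> higher flood exposure, same climate
```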

We insure a large water and sewer district in Florida. During Hurricane Ian, it leveraged Facebook to keep in touch with its customers about what was happening, and customers were able to update the district on their location and whether they had sewage or water in their houses. Even basic forms of technology, as in this case, are improving risk and situation management. Not only can this keep people safer, it can also reduce risk. We view it as a clever, low-cost way to leverage technology during a climate event.

Will there always be a need for human insight into what data is important?

In the public entity business, there's no question that you need human supervision and oversight, given the vast spectrum of exposures. As an industry, we are most powerful when we combine machines' ability to ingest and analyze huge amounts of data with humans' intuition and experience. 

The future for us is feeding the most critical, actionable intelligence to underwriters at the right time, so they can quickly make well-informed, expert decisions.