MassMutual matches AI with data science

MassMutual headquarters in Springfield, Massachusetts.

Digital Insurance spoke with Sears Merritt, head of enterprise technology and experience at MassMutual. The carrier has been migrating legacy technologies to more up-to-date systems and unifying its customer experience. Merritt works on simplifying and modernizing the customer experience, as well as building the carrier's digital capabilities for handling data. MassMutual harnesses AI, APIs and data science to manage data in a way that provides value for the firm.

What functions do you work with the most to better handle data and modernize operations?

Sears Merritt, head of enterprise technology and experience at MassMutual.
Across the entire policyholder lifecycle, we've got core initiatives to build brand new greenfield digital capabilities that enhance the service experience. We're really on the margin, innovating and improving those digitally enabled experiences, including call center, post-issue and in-force servicing activities, fraud detection, lapse management, cross-selling and advisor engagement.

From an advisor perspective, how do we make it easier for an advisor to serve their clients? What capabilities, automation and corresponding modalities do we need to deliver that, whether it's an app, a website or portal, or a great human-in-the-loop call center experience? We're truly focused across the entire lifecycle, with the most emphasis on in-force servicing, new business and underwriting.

How does MassMutual incorporate AI and data science into those areas and others?

A lot of new AI capabilities have come into the marketplace, doing things that I'm not sure anyone was expecting even just a few months ago. We've really made sure that we have enabling capabilities in place to take advantage, whether we choose to build in-house based on our data assets or take advantage of more features showing up in core systems we use across the company.

Our core focus for data and automation is making sure that we have a rock solid, modernized, easy-to-use data infrastructure that makes the right data available at the right time through the right modality. We put a lot of time and effort into being consistent in how we build out and expose data to our different systems and teams. It's making sure that our core systems, and the interactions that happen on them, are generating data that can be consumed downstream to connect systems, drive process automation or feed AI and data science capabilities and applications.

Similarly, we've spent a lot of time in the data lake / data warehouse space, doing a complete revamp, or replatforming, of all of that infrastructure into one enterprise data environment that combines a couple of different things. We have a data lake and a core, fairly structured analytics warehouse that contains information from across the entire organization. We've again organized that in such a way that it can drive business processes, real-time analytics and reporting, and everything in between.
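The pattern Merritt describes, curating raw lake data into one structured warehouse that serves both business processes and reporting, can be sketched in miniature. This is an illustrative toy, not MassMutual's actual stack: JSON lines stand in for raw data-lake files, and an in-memory SQLite table stands in for the analytics warehouse; all record names and fields are hypothetical.

```python
import json
import sqlite3

# Hypothetical raw records, as they might land in a data lake (one JSON object per line).
raw_lake_records = [
    '{"policy_id": "P-1", "event": "issued", "premium": 120.0}',
    '{"policy_id": "P-2", "event": "lapsed", "premium": 95.5}',
]

# An in-memory SQLite table plays the role of the structured analytics warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policy_events (policy_id TEXT, event TEXT, premium REAL)")

# Curate the raw lake records into the warehouse schema.
for line in raw_lake_records:
    rec = json.loads(line)
    conn.execute(
        "INSERT INTO policy_events VALUES (?, ?, ?)",
        (rec["policy_id"], rec["event"], rec["premium"]),
    )

# The same curated table can now serve reporting and analytics queries.
count, total_premium = conn.execute(
    "SELECT COUNT(*), SUM(premium) FROM policy_events"
).fetchone()
print(count, total_premium)  # 2 215.5
```

Once the raw records are landed in a consistent schema, any downstream consumer, whether a dashboard, a batch report or a model-training job, can query the same table rather than re-parsing the raw files.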

What are the challenges in connecting the APIs, AI and data?

In the early days of getting all of this work started, it was just the basics. Where does what data live? Which system has collected it? How is it represented? Do we have all the metadata that we need to understand the conditions by which the data is generated, what the information means and how it can be used for these different kinds of activities? When we first started working on a lot of these modernization activities, particularly in the data space, that was a big emphasis. We set up a data governance team, got the right tooling and process in place. 

Fast forward to today. We invested a lot to provide easy-to-use abstractions on top of all that data. When we build our microservices, we use an enterprise object model, a common logical way to represent all the data in the company. We make that data available to our developers so they have just one way to think about and consume all that information, whether we expose it through an API, whether they get it directly from a database table or a file in the data lake, or through a topic on one of our streaming platforms.
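The enterprise object model idea, one canonical shape for a record no matter which channel delivers it, can be sketched as follows. This is a minimal illustration, assuming a hypothetical `Policy` record with made-up fields; it is not MassMutual's actual model.

```python
from dataclasses import dataclass
import json

# Hypothetical common logical model: one canonical representation of a policy,
# regardless of whether it arrives via a database row or a streaming message.
@dataclass(frozen=True)
class Policy:
    policy_id: str
    holder_name: str
    status: str

def from_db_row(row: tuple) -> Policy:
    """Map a (policy_id, holder_name, status) database row to the model."""
    return Policy(policy_id=row[0], holder_name=row[1], status=row[2])

def from_stream_message(message: str) -> Policy:
    """Map a JSON message from a streaming topic to the same model."""
    payload = json.loads(message)
    return Policy(
        policy_id=payload["policyId"],
        holder_name=payload["holderName"],
        status=payload["status"],
    )

# Both paths yield an identical object, so downstream code is source-agnostic.
a = from_db_row(("P-100", "Jane Doe", "in-force"))
b = from_stream_message(
    '{"policyId": "P-100", "holderName": "Jane Doe", "status": "in-force"}'
)
print(a == b)  # True
```

The payoff is that developers write against one type; adding a new source (say, a file in the lake) means writing one more adapter, not touching every consumer.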

How do you think about data management? How should insurers think about data management?

Very carefully, to put it mildly. They should think about it in the context of their business strategy, both in how they can use data to create value for themselves and their customers, and in how they can use it to actively manage and mitigate risk to their policyholders, regulatory risk, industry risk and things of that nature. A mature data management practice enables anyone in the company to understand under what conditions they can use a certain type of data, on a context-by-context basis.

If you can do that, and you've created a platform that makes data accessible relatively easily, then you have a good data management practice. The context piece is really important, because the answer to the question is going to be a function of where you are in the policyholder lifecycle. If you want to design a marketing or brand campaign, the data you want to use and the conditions in which that data was generated might look very different than for using data further on in the lifecycle, like for underwriting. There are many reasons why your data might be suitable for a campaign, but you might want to consider it off limits or unsuitable to use in other processes in the lifecycle.

What is the future for MassMutual’s technology operations?

The future is coming fast. What we see developing with large language models is bigger than many in the industry expect. That will bring disruptive change on a scale that many of us in technology leadership haven't seen in our careers.

From a pure technology perspective, we're looking at disruptions in how we do computing. Quantum computing, for example, is something we're watching, monitoring and figuring out under what conditions we would want to take advantage of those kinds of technologies, whether it's risk management, investment activities or cybersecurity. 

Going back to data, with wearable computing, there's lots of opportunities to integrate with wearable devices to improve policyholder experiences, whether from a new business and underwriting perspective, or even from a post-issue engagement, financial planning perspective. We anticipate that technology will become more valuable and key to the customer relationship.

How will advances in AI, like ChatGPT, fit into communications and customer service?

It's going to impact us everywhere, quite frankly. I don't think it's just limited to how it benefits a policyholder from an experience perspective. There are lots of ways this technology will be brought to bear, including using these capabilities to provide the right information at the right time to policyholders, in the way they want to consume it, whether through a chat, a phone call or an email. There are still a lot of questions around how we should use this, based on the data used to create these capabilities.

From a productivity point of view, how we get our work done is going to dramatically change in the technology space. We hear about assistive technologies that improve productivity and speed up our ability to create and build. That goes across the entire enterprise: legal, compliance, marketing, communications. We have a large portfolio of use cases and pilots that we're exploring to figure out where this capability can be put to use.