Applied AI: Smarter, Faster, Better: Harnessing The Power of Data With Machine Learning

Insurers are increasingly using advanced machine learning to drive smart, automated applications in fields such as healthcare diagnosis, predictive maintenance, customer service, automated data centers, self-driving cars, and smart homes. Machine learning can be applied effectively across structured, semi-structured, and unstructured datasets, as well as right across the value chain, to understand risk, claims, and customer behavior with higher predictive accuracy. Working with machine learning does, of course, have its challenges.

Transcription:

Nate Golia: (00:09)
Thank you so much. Hi everyone, thanks for coming in for our session. I changed my shirt after the morning session; I was pretty sweaty. If you remember my conversation with Bill Pappas, I put the questions on my phone, which I did not realize the optics of until I was about to look at them. We're going to get started. I just want to let our panelists introduce themselves. We'll start right here!

Ryan Rist: (00:30)
Hi everybody, Ryan Rist. I lead innovation at USAA on the insurance side, focused primarily on our property business and underwriting, a lot of data and analytics work, and also a lot of our safety and loss prevention work. As you probably know, USAA is a very mission-focused company, so a lot of our innovation is around how we build an ecosystem of value for our members.

Tim Carlson: (00:55)
I'm Tim Carlson. I've been at Travelers about 16 years. I'm part of our enterprise AI team, whose real purpose is partnering with our different business partners and lines of business to deliver value and accelerate our capabilities across the enterprise.

Nate Golia: (01:08)
Great, thanks so much. If you have been following a lot of what's been spoken about already today, but also what's been going on in the past decade or so with insurance and insurtech, what's really happening is that the insurance industry has been taking the data it has collected over decades of the insurance business and is now trying to use that data to create better experiences. One way we talk about that is through AI and automation. I'll start with our panelists: how have you seen attitudes towards using AI evolve as people have become more comfortable with the digitalizing insurance enterprise? Did you want to start, Ryan or Tim?

Ryan Rist: (01:48)
Sure. Starting from the business side, business partners and people inside of the company, I think we've started to see predictive models used in really meaningful ways. Often, if you've been in insurance for a long time, you're probably not a native technologist, right? Some of this is pretty deep technical work. Now we're bringing predictive models to market that are personalizing coverage for people. We can give our members a prediction of how much coverage they need at a certain point in time. We're able to use computer vision and see changes in property over time. We're seeing these use cases hit the market and have meaningful impact, and I think that is creating a lot of interest in wanting more of it. If I go pre-pandemic, there was a lot of activity, but I don't know if everybody was bought in on data science as a really core capability. I think that's changing. The pandemic forced us to do things we maybe wouldn't have chosen to do beforehand. If you can't physically inspect a property, you have to get comfortable with predictive models and data. We're finding out that it works and it has value across the entire value chain. So that's what I've seen the last three or four years.
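[Editor's note] The coverage-prediction idea Ryan describes can be sketched in miniature. This is an illustration only: the closed-form one-feature regression, the toy history, and the rounding rule are all invented for this sketch and are not USAA's actual model, which would use far richer features and validation.

```python
# Hypothetical sketch: suggest a dwelling coverage amount from a single
# property feature (square footage), fit on made-up historical data.

def fit_ols(xs, ys):
    """Closed-form simple linear regression: y ~ a + b * x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

# Toy history: (square footage, replacement cost the member actually needed).
history = [(1200, 210_000), (1800, 300_000), (2400, 390_000), (3000, 480_000)]
a, b = fit_ols([h[0] for h in history], [h[1] for h in history])

def recommend_coverage(sqft):
    """Suggest a coverage amount, rounded up to the nearest $10,000."""
    raw = a + b * sqft
    return int(-(-raw // 10_000) * 10_000)
```

In practice the point is the product framing, not the math: the model's output is surfaced to the member as a personalized recommendation rather than a form field they have to guess at.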

Nate Golia: (03:07)
Did you want to?

Tim Carlson: (03:08)
Yeah, I think it's the pandemic, and I think technology has also gotten better: more compute power, the ability to do things that maybe we couldn't do a few years ago. In terms of attitudes changing, I think a lot of it is education, and there's been more education out there as we partner with lines of business. One of the key things from my dear business partner, a former actuary: she says it's not magic, it's math. We always push that with our business, along with a lot of education and realizing that if a human couldn't do something, we shouldn't expect AI to. Think about damage to a property: if there are differences of opinion between claim professionals about what's medium damage and what's a total loss, how could we expect AI to solve that? So there's the education component and setting expectations about what's possible. And as we start to get some wins showing what we're good at, and maybe what the technology isn't quite capable of doing yet, that's led, I think, to a better spot than a few years ago.

Nate Golia: (04:06)
Let's talk a little about the experience that people are learning to expect in this early stage. There's the customer experience, but there's also the internal experience with the stakeholders we're talking about internally. How are we seeing those start to align as people on the insurance enterprise side start to understand what the customer wants and see how automation can feed that?

Tim Carlson: (04:29)
I think we are seeing where things are working well, and a lot of it is, as we work as an enterprise group across a number of different lines of business, how do we replicate things that worked really well in one area in other areas? It's actually really never a lift and shift, because there are different business processes, different systems, a lot of different considerations that you need to take into account as you deliver those. But I think it's all about figuring out what you're good at and diving deep into those. A few years ago, going broad and trying to solve everything for everyone was an approach we had taken, and it wasn't as successful as narrowing our focus.

Ryan Rist: (05:14)
Yeah, I see it really forcing technologists and engineers to work hand in hand with people who understand insurance. Historically we have gone through waves of how we handled IT, and IT has sort of been segmented off, and now it's coming back to where we're working hand in hand. If you think about it, people who know insurance really well understand where they want to take the business and what the drivers are, and it's a complicated business at the end of the day, and the technologists offer up ways to get stuff done. So the more closely we can get business people and technologists working together to solve problems, that's where the magic is. And I think it's materializing in things like data science, where you can have a data scientist who's really good at what they do and knows nothing about insurance. If you don't pair them with somebody who understands the problems in insurance, you'll get nothing done; you'll get science projects that go nowhere. I think this is forcing insurance companies to act more like technology companies, which is how we should act, because ultimately what we offer is an intangible data product, right? And so we have to think like a technology company would.

Nate Golia: (06:21)
Right, and I think that's been touched on a couple of times already. As we talk about automation encompassing new kinds of technology, now that we've reached sort of a baseline with what to expect from automation, and people understand how it's going to work in insurance, how should companies start to think about adding AI and machine learning to those automation projects and taking them to another level?

Tim Carlson: (06:45)
Yeah. I'm not sure that we've reached a total baseline yet; I think there is definitely more education to go. In terms of starting to apply machine learning, I think back to the keynote this morning and the "what if" statements. One of the areas where we've been really successful has been assessing claim damage after weather events. Within 24 hours after these events we see across the US, some of these hail events, the fires, we actually have pictures being taken, aerial imagery; we're running models against those, assessing the damage, and understanding where some of our major losses are. That's been a real game changer, not just internally as we think about how we process and start claims, but really for our customers. Think about going through one of these weather events when you're in a really vulnerable spot. If we're able to reach out to you proactively, start the process of getting your claim started, get you some money if you're in a hotel, all those things that deliver on the Travelers promise, that's a really big win for us.
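[Editor's note] The step after the imagery model runs can be sketched as a simple triage pass. This is a hypothetical illustration only: the computer-vision model itself is out of scope here, and the score thresholds and action labels are invented, not Travelers' actual pipeline.

```python
# Hypothetical sketch: given per-property damage scores produced by an
# imagery model after a weather event, rank properties and assign a
# next action, worst damage first, so proactive outreach starts with
# the most vulnerable customers.

def triage(properties):
    """properties: list of (policy_id, damage_score in [0, 1]).
    Returns (policy_id, score, action) tuples, sorted worst-first."""
    def action(score):
        if score >= 0.8:
            return "proactive outreach"
        if score >= 0.4:
            return "adjuster inspection"
        return "monitor"
    ranked = sorted(properties, key=lambda p: p[1], reverse=True)
    return [(pid, s, action(s)) for pid, s in ranked]
```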

Ryan Rist: (07:46)
Yeah, I think really good distinctions were made by the keynote and others about the difference between automation and intelligence, and we're talking about intelligence here, with machine learning and deep learning and neural nets. I think there's a lot more possibility and value in the latter. I've really encouraged people to experiment: run a test and do it with a business-outcome mindset. Say, hey, if we had a prediction machine that could do this, what would it mean to our business? Would it drive up sales? Would it lower our cost structure? Would it increase sophistication in pricing and underwriting? Then go run that experiment with a small sample, build the case for it, and then scale it. I think that's another problem that's not just data science related; it's also innovation related. Oftentimes we don't place small bets. We don't experiment in really small fashion to prove our ideas out before we go scale them. If you prove them out, you'll have a strong business case, the CFO will be your best friend, and you'll get the funding you need to bring things to market. So I think there's also a path to get to market that I spend a lot of time obsessing over on the innovation side of things.

Nate Golia: (09:00)
Can you guys share any examples of getting to this next level of these sort of maybe small successes that you've had, something that you can talk about from real world application?

Ryan Rist: (09:12)
Well, I'll give you predicting coverage. We're a mission-focused company; our mission is to facilitate the financial security of our members, and that is first and foremost in everything we do. So while there probably are expense-reduction models we could deploy, we're obsessed with making that experience frictionless and better. Think about routing calls based on intelligence, right? Should it go to this group or that group? Well, if you've got an urgent case, or maybe a more tactical case, it should go to that group. If you've got a propensity or openness to maybe looking at new products, maybe it goes to another group. Certainly on the product side, it's when you really dive into what we ask of consumers: to know about insurance, to know about their property, to know about their assets.

Ryan Rist: (09:59)
There are some companies doing great things out there, but we've still got a long way to go in simplifying that and predicting what people need. We can predict whether or not you're likely to have a non-weather water loss. Well, why don't we pair you with a smart home technology that you can get a deep discount on through our scale? We can predict that you're going to need water backup coverage based on your property: where it's located in the US, its features, when it was built, what its plumbing consists of. Those are really powerful ways to take friction out of the process and to build and enhance trust. That's our whole purpose in life, to build and enhance that trust.
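[Editor's note] The propensity-plus-next-best-action pattern Ryan describes can be sketched as a small logistic score. This is purely illustrative: the features, weights, bias, and action thresholds are all invented for the sketch; a real model would be trained and validated on claims history.

```python
import math

# Hypothetical sketch: score the propensity for a non-weather water loss
# from a few property features, then map the score to a next-best-action
# such as pairing the member with a smart water-shutoff device discount.

WEIGHTS = {"age_years": 0.04, "has_basement": 0.9, "plumbing_updated": -1.2}
BIAS = -2.0

def water_loss_propensity(prop):
    """prop: dict of feature values keyed like WEIGHTS. Returns [0, 1]."""
    z = BIAS + sum(WEIGHTS[k] * prop[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def next_best_action(prop):
    p = water_loss_propensity(prop)
    if p >= 0.5:
        return "offer smart water-shutoff discount + water backup coverage"
    if p >= 0.25:
        return "suggest water backup coverage"
    return "no action"
```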

Nate Golia: (10:44)
Do you want to?

Tim Carlson: (10:46)
I mentioned the claim example, which is one of our capabilities in market. We're also looking at submission intake. For submissions coming in from agents and brokers, how do we more quickly bring those through the process so you're not waiting hours and days to get a quote returned? What are the steps taken once those submissions come in to quickly identify who the insured is and all the various aspects of that submission? So there are a lot of variables and a lot of different ways we're applying machine learning to more quickly return a response to our agents and brokers.

Nate Golia: (11:21)
Great. I'm going to go a little off script here, because I wanted you guys to expand on a point that both of you touched on, and that's the tipping point for going from automation to AI and machine learning. Ryan, you were talking about call routing: we're trying to route calls intelligently so that people get their questions answered. In the past, getting your call routed at all was revelatory; the fact that there was some automated way to bring people in was enough, but now people are looking for the next level of experience. I wonder if you could both talk about managing those growing pains. People are going from an automated experience, which might not have been the greatest the last time they had it, to something that's going to be better. How do we get people to understand that this is going to be a better experience, even though it is still being mediated digitally?

Ryan Rist: (12:15)
Well, I think part of that is just consumer expectations. Even though we've been in a pandemic for two years, the trajectory we were on prior to that, of consumers expecting better and better experiences, where your last best experience is the expectation of every future experience, hasn't changed. In fact, it's gotten even stronger. I mean, how many people had groceries delivered in the past couple of years? How many people had a telehealth call with their doctor? So much happened that is actually still very much alive and well. So we as companies are still playing that game; we're competing with the Amazons of the world to deliver a great experience. I don't think that's slowed down. I think companies have realized that they can do it.

Ryan Rist: (13:01)
There's a lot of, I would say, angst in that. Bringing those models to production is a big deal; you're changing an experience. So that's where I think experimentation can help. Can we run this in a confined way to get everybody in the company comfortable with the outcome before we launch it globally? I still don't think insurance companies are good at experimentation. A lot of it's blamed on regulatory compliance, but the fact of the matter is that the companies that are going to win, I think, are going to be able to experiment and partner quickly and try things in a confined way. Once you have the data that says, this is what the experience was, a lot of those walls fall down. The detractors say, well, you're right, now let's move it to the next phase. We tend to think nationally or nothing in insurance, or statewide or nothing, and we have to think more at the micro-experimentation level.
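[Editor's note] The "small bet" Ryan describes usually ends with a statistical read-out: did the confined pilot beat the control by more than chance? A minimal sketch, using a standard two-proportion z-test; the sample numbers are made up for illustration.

```python
import math

# Hypothetical sketch: compare conversion (or retention, satisfaction)
# rates between a control group and a confined pilot running the new
# model-driven experience.

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """One-sided two-proportion z-test: is B's rate higher than A's?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Probability of a lift this large if B is really no better than A.
    p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return z, p_value

# Control: 120/1000 converted. Pilot with the new experience: 156/1000.
z, p = two_proportion_z(120, 1000, 156, 1000)
```

With evidence like this in hand, "the CFO will be your best friend" stops being a hope and becomes a business case.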

Tim Carlson: (13:54)
Yeah, I agree with that. I think part of the answer is also, it depends, right? Just because something has automation built in today doesn't mean it's a great candidate for AI, and actually I think there are more examples where we say this process is not a great candidate, for a variety of reasons. It could be the data's not there. It could be there are a lot of legacy systems that would be really hard to integrate with. It could be that the business process has a lot of different factors, a lot of variability. So there are a lot of reasons why something may not be a great AI candidate. But to the experimentation point, that's a really good point: if you start small, you prove something out, and then you can start to expand on that. It doesn't necessarily need to be this huge win; you don't have to go for the grand slam every time.

Tim Carlson: (14:37)
What are those small things you can build out? But they also need to be meaningful, because with every solution, every model you're rolling out, there is something there from an overhead standpoint. You need to maintain the model, you need to monitor it, you need to make sure it's operating as you would expect. There is a lot that goes along with that. So it's figuring out what those wins are and expanding those, but then also having the right infrastructure, culture, and team in place to support it going forward.
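[Editor's note] The monitoring overhead Tim mentions often starts with a drift check: do the inputs a live model sees still look like the data it was trained on? One common, simple check is the Population Stability Index (PSI) over binned distributions; the thresholds below follow a widely cited rule of thumb, sketched here for illustration rather than as any one company's practice.

```python
import math

# Sketch: compare the binned distribution the model was trained on
# ("expected") against what it is scoring in production ("actual").

def psi(expected_fracs, actual_fracs, eps=1e-6):
    """Population Stability Index; both args are per-bin fractions
    summing to ~1. Higher means more drift."""
    total = 0.0
    for e, a in zip(expected_fracs, actual_fracs):
        e, a = max(e, eps), max(a, eps)  # guard against empty bins
        total += (a - e) * math.log(a / e)
    return total

def drift_status(value):
    if value < 0.1:
        return "stable"
    if value < 0.25:
        return "watch"
    return "investigate / consider retraining"
```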

Nate Golia: (15:03)
Yeah. I'm just thinking about what you said about things not necessarily being a great candidate for automation: maybe the AI technology isn't there, maybe the data isn't there yet, maybe there's a regulatory barrier that's going to stop it at some point. How should insurance companies be thinking about prioritizing projects? How do you decide in your enterprises, we want to experiment here because we think it's a good candidate? How do you find those good candidates internally to start applying these technologies to?

Ryan Rist: (15:39)
Well, the ideas come from, sorry.

Tim Carlson: (15:42)
No, go ahead.

Ryan Rist: (15:43)
The ideas come from everywhere. I think part of the solution is not cutting your innovation or R&D funds just because it's a tough year. Persistent, dedicated focus on that is very hard. Leaders come and go, priorities shift, the market is up and down. Especially right now, with what we're facing, will your company keep that funding there later this year? If not, that's a problem. You lose a lot of talent, you lose compound learnings, and you have to start from scratch the next year when it's the hot thing to do. So I think there are more institutional challenges around that. And then the people running those departments are accountable for driving value to the organization, so understanding the financial levers of your business and proving out that it has value; it's something we were having a conversation about today. It's incumbent upon us to prove out that there's value here. And then I think it becomes a lot easier. Not easy, but easier.

Nate Golia: (16:48)
Tim, do you want to join in?

Tim Carlson: (16:49)
As I said, I think the prioritization goes back to the business goals. What are the targets the business has? Figure out the levers you could potentially pull across the different processes they have, where you could roll out solutions that are either going to improve customer experience, increase throughput of submissions, or deliver faster claim resolution. There are a number of different ways to look at it, but I think it all needs to tie back to the business goals. And then those other factors I talked about previously also really help determine what's the right candidate.

Nate Golia: (17:24)
We've got one more sort of three-part question, and then we can open it up to the audience. For insurers looking to implement machine learning, we're talking about the challenges that have emerged. We had identified three key areas, and maybe some prescriptions for how to work with them. So let's start with the technology environment. Tim, you mentioned legacy systems already. Someone might be saying, I've got too many legacy systems to do this. Where does someone start there?

Tim Carlson: (17:53)
Well, I'd say in all the work we do, it's rarely a technology problem in terms of building an AI model, building the solution. It is very much a business process and integration problem. You can build something fantastic on the side, but then figuring out how you integrate that back into your tech stack and back into your business process, that's the challenge. So I think it does depend, right? There are a number of ways you can go about integrating it, depending on the number of legacy systems you have and whether they're API-enabled or not; maybe you need to bring in other technologies to complement your models as well. There are ways to attack it, and every solution's a little bit different, but I'd say the challenge is much more frequently how to integrate it back into the business process than actually creating a model that has a benefit.
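[Editor's note] The integration work Tim describes is often a thin adapter layer: translate a legacy system's record into the model's feature names and types, and translate the score back into a code the existing workflow understands. A minimal sketch, with hypothetical field names and thresholds invented for illustration.

```python
# Hypothetical mapping from legacy mainframe-style field names to the
# model's feature names. Legacy feeds often store everything as strings.
LEGACY_TO_FEATURE = {"POL_YR_BLT": "year_built", "POL_SQFT": "square_feet"}

def adapt_record(legacy_record):
    """Translate a legacy policy record into the model's feature dict,
    coercing string values defensively and tolerating missing fields."""
    features = {}
    for legacy_key, feature_name in LEGACY_TO_FEATURE.items():
        raw = legacy_record.get(legacy_key)
        features[feature_name] = int(raw) if raw not in (None, "") else None
    return features

def score_to_decision(score):
    """Map a model score back to the code the claims workflow expects."""
    return "FAST_TRACK" if score >= 0.7 else "MANUAL_REVIEW"
```

The model call sits between these two functions; the adapter is what keeps the "fantastic thing built on the side" connected to the tech stack that actually runs the business.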

Nate Golia: (18:48)
Ryan, anything you wanted to add on how to deal with the integrations, any tips you have for people there?

Ryan Rist: (18:53)
No, I was thinking more of the non-technical aspects. Again, if you're going to implement change into a system, as Tim mentioned, it's training staff, it's hours. So again, I go back to: before you even start, make sure you understand what your hypothesis is and the value it's going to bring, because it just makes it easier to get institutional support. At the end of the day, a general manager or president has to make trade-offs: do I make this change or that change? And I think you've got to build the case for it.

Nate Golia: (19:25)
We'll start with you then for the second part, which is to talk a little about staffing and talent. Staffing's going to be a theme throughout this conference. For insurers looking to build out an AI practice, or at least to integrate AI, how do they go about finding people, or retraining and re-skilling? What is the balance there?

Ryan Rist: (19:47)
Yeah, I don't know if talent retention is being talked about enough. How does the insurance industry retain data scientists when they're getting offered two or three times as much to go work at tech companies? And not every company in insurance is public, so how do you mimic or replicate the stock options and ownership you get at either a startup or a tech company? I think you're going to hear more of that in the next few years, because there are a lot of people taking jobs in other industries and leaving insurance. So how do we retain them? I think we have a great case for it. As people have talked about today, the keynote particularly, we have to create areas where they feel empowered to solve problems, build new product lines, build new things around maybe the insurance product. A data scientist doesn't want to just create models that never see the light of day. They're engineers at heart, right? Engineers like to build things that get out to the market. I don't know what the solution is, but we have a lot of challenges in the industry to keep this very scarce group of talent. If you believe that data science is going to be at the heart of the future of insurance, are we really tackling the issue enough? I don't know if we are.

Nate Golia: (21:03)
Tim, I think you've mentioned the idea of having both insurance expertise and technology expertise, and how important it is to keep those things together. So what are your thoughts on how to bring those together? Is it easier to teach AI to people who already know insurance, or the other way around?

Tim Carlson: (21:20)
Well, on teaching insurance: a lot of the people on our team have been in different parts of the business. And beyond the insurance knowledge itself, understanding the political landscape, the different stakeholders, understanding the people, is a really important part. Part of the advantage of being in the insurance industry is that a lot of people are making data decisions already, or are actuaries. We have the ability to upskill individuals and offer new career paths. The phrase I was thinking of as you were talking was meaningful work: providing an opportunity to work on projects that are ultimately going to make it to market and actually make an impact. I think showing how what they build can make an impact for our customers, and really highlighting those stories, is definitely a focus of ours. But I do think it's an advantage when you train some of your internal employees who maybe don't have some of those skills, really upskilling them, because they're bringing a lot of those other things to the table, along with that knowledge across the business.

Nate Golia: (22:26)
Our last portion of that question is about vendor and partner selection. With AI and machine learning becoming buzzwords, there are a lot of companies saying they're doing it. When someone comes in and pitches you an idea, what are you looking for? What are some of the things you're looking to hear that tell you there's promise in what they're delivering?

Tim Carlson: (22:48)
I think the challenge, like I mentioned earlier, is that you can have a really good technology solution, but you have to figure out how it's going to integrate back into the business process and have a really strong handle on that. Otherwise you're going to be in a spot where maybe you have these things that in theory can come back and provide some value, but when you put them in practice, it doesn't necessarily work out the way you expect. So figure out how it's actually going to play with your current world. And then just understand some of the details: if it is truly an innovation, test-and-learn engagement, what are the parameters of that? So you're not going down a path where, many months or years into a program, expectations are out of line with what the business is looking for.

Ryan Rist: (23:31)
Yeah, beyond the obvious things you look for in partners, like cultural alignment and fit: first of all, I take very much a build, buy, partner approach. I'll do all three of those, whatever gets the right solution. We used to build everything in house, and now we're very much looking at all three. But when partnering, beyond the obvious, I look at where their product roadmaps are headed and where else they might innovate. It takes energy to engage with a partner: time, resources. They might be building a claims prediction model today, or a wildfire model today, but if they can branch out into other areas, or they're headed into other technologies that are interesting, and they're a good partner and they can move quickly, then I can get more done with them. So I think about the runway with a partner in addition to some of the more obvious factors.

Nate Golia: (24:21)
Sure. Well, thanks. Does anyone have any questions for our panel? We'd be happy to take a couple. Anyone? Right up front, Melissa. Thank you.

Audience Member 1: (24:34)
Hey, I have a bizarre question, and it's a bizarre question on how you think a data science team should be organized. How many data scientists should there be, should they be integrated with product, what would be the right number of managers compared to engineers, and what's optimal and why?

Ryan Rist: (24:50)
That's Tim's question.

Tim Carlson: (24:52)
So the question is how many data scientists aligned to the product managers. I think it really depends on the complexity of what you're trying to build. I would say absolutely align to the product; ultimately you don't want to have data scientists developing off in a corner. We do a good job of really integrating data science with the team, right? It's kind of a full-stack team that has the ability not just to build the models, but also to deploy and monitor, all those things I was talking about previously. The sheer number probably depends on how big the program is and what you're ultimately trying to build. But we've definitely had success with integrating the data scientists with our team to deliver products.

Nate Golia: (25:38)
Anyone else have a question? Just to check the time here, we've got about four more minutes. I guess we could break four minutes early. Thank you both so much; I really appreciate you doing it. We've got one more session in this track, so please stick around. Thank you.