How to Define Insurance IT Success

As we discussed, vision statements for insurance business transformation and core system replacement projects significantly increase the probability of success. But is that enough? It can be difficult for teams to operationalize vision statements. Just as agile development teams have found that acceptance criteria can help to define requirements and guide development, success measures can help to define visions and guide decision making critical to the overall success of these projects.

Many of the project teams I have talked to define success by the traditional “on time” and “on budget” (OTOB) measures. Whether they actually hold themselves accountable to this or any other measure is a topic for another day. Is OTOB a good or meaningful measure for a team? In practice, it drives teams toward cutting scope and implementing the lowest-cost features rather than the highest-priority ones. If you hold teams accountable for delivering a defined scope on time and on budget, the measure tends to encourage shortcuts and negatively impact quality. Teams with only this measure inevitably end up with a significant backlog of functionality and technical debt that may or may not be funded and addressed. While financial controls are necessary and responsible, they reflect the cost orientation of many IT organizations and fail to measure delivered value.

So what should success measures in the insurance industry look like? A good place to start is the project benefits cited by carriers in a recent Novarica survey. These categories are heavily weighted towards internal processes but do include some external measurements like ease of doing business for our distributors.

As I mentioned in my previous piece, customer experience focus is often lacking in vision statements and in success metrics due to the difficulty in measurement. It is usually measured indirectly through growth metrics.

These benefits were often measured subjectively. Our research did not identify whether quantified goals were established before the project began or how the impact was measured. However, it seems reasonable that if the goal of a project is to achieve one or more of the benefits above, then measuring those outcomes should be part of the success criteria. So what should your success measures look like?

At Novarica, we believe success measures should be:

• Outcome-oriented

• Balanced

• Easy to understand, easy to verify

• Tied to key business strategies, i.e., important to the organization

• Leading rather than lagging indicators

Let’s take the example of a consumer portal implementation for a specialty company program. Assume that the company is not just replacing obsolete technology, but wants to grow revenue and improve retention by creating a superior policyholder experience. The vision statement should give some clarity as to the types of interactions, sources of leads, and degree of self-service that are targeted. It may be as general as: “To create a user / client community which becomes the preferred source for self-service and for safety, loss reduction, and related services for the targeted program industry.”

Success measures that target incremental new-client revenue generated from the site would reinforce the strategic objective, but revenue may be too much of a lagging indicator to provide feedback to the development team. To get earlier feedback on the customer experience, carriers should run focus groups and A/B testing of design approaches to understand how actual customers react; the reaction of carrier-based testers to the proposed solution is not enough. These techniques have been used in other industries for many years but are rarely applied to insurance solutions. Website analytics for ease of use and consumer behavior are also extremely helpful in improving the site and achieving the vision. This type of design feedback loop is considered a best practice and should be implemented. However, while critical to the overall attainment of the objective, these feedback mechanisms are too detailed and tactical to be declared success measures.
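To make the A/B testing idea concrete, here is a minimal sketch of how results from two portal design variants might be compared. The variant names, visitor counts, conversion figures, and the definition of “conversion” (completing a self-service transaction) are illustrative assumptions, not figures from any carrier study.

```python
# Minimal sketch: compare conversion rates of two portal design variants
# using a two-proportion z-test. All counts are hypothetical.
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Did variant B convert at a different rate than variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p_a, p_b, z, p_value

# Hypothetical test: 5,000 visitors per variant, "conversion" = completed
# self-service transaction.
p_a, p_b, z, p_value = z_test_two_proportions(conv_a=400, n_a=5000,
                                               conv_b=465, n_b=5000)
print(f"Variant A: {p_a:.1%}, Variant B: {p_b:.1%}, z={z:.2f}, p={p_value:.3f}")
```

Feedback like this belongs in the design loop described above; it informs the team without itself being a project-level success measure.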

Success measures for this vision could include quantification and evaluation of:

• Omni-channel transitions and drop rates. Is the transition smooth, timely and efficient when insureds transfer from the portal to the agent or the call center?

• Self-service – measuring the shift in volume or percentage of self-service transactions from agents, the call center, or existing service systems to the new portal, as an indicator of ease of use

• Content access – Is there an increase in access and utilization of content?

• Consumer and agent feedback.

• Implementation of community-suggested features and enhancements – are you engaging your customer base, and are they helping to build out the experience by contributing content or suggestions?

Defining success measures at this level means that carriers must determine how they will capture the required level of information as part of their solution design process. Many analytical tools can capture customer-experience data as a byproduct of the process without being invasive or requiring feedback from the customer. The owner of this capability should have detailed data on when customers visit the site, what they are attracted to, and where they leave the system. That detail will help the owner refine the user experience and progress toward the desired goals.
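As a simple illustration of the kind of non-invasive analysis such tools support, the sketch below derives visit timing, the most-viewed content, and the pages where sessions end from ordinary page-view events. The event structure, field names, and sample records are illustrative assumptions, not a specific vendor’s data model.

```python
# Minimal sketch: from hypothetical clickstream events, summarize when
# customers visit, what they view most, and where they leave the system.
from collections import Counter
from datetime import datetime

events = [  # hypothetical records: (session_id, timestamp, page)
    ("s1", "2024-03-01T09:02:00", "/login"),
    ("s1", "2024-03-01T09:03:10", "/policy/summary"),
    ("s1", "2024-03-01T09:05:40", "/claims/new"),
    ("s2", "2024-03-01T13:15:00", "/login"),
    ("s2", "2024-03-01T13:16:30", "/safety-resources"),
]

visit_hours = Counter()   # when customers visit
page_views = Counter()    # what they are attracted to
last_page = {}            # last page seen per session = exit page

for session_id, ts, page in sorted(events, key=lambda e: (e[0], e[1])):
    visit_hours[datetime.fromisoformat(ts).hour] += 1
    page_views[page] += 1
    last_page[session_id] = page

exit_pages = Counter(last_page.values())

print("Busiest hours:", visit_hours.most_common(3))
print("Most-viewed content:", page_views.most_common(3))
print("Most common exit pages:", exit_pages.most_common(3))
```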

You can’t measure improvement and success without a baseline, and many carriers do not know the baselines for these attributes. Just like an exercise program, you need to understand where you are starting from in order to measure and celebrate success. Establishing a baseline for future success measures can also lead to a better understanding of current issues and help prioritize features or identify quick-hit fixes to current capabilities.
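In practice, baselining can be as simple as recording pre-implementation values for the chosen measures and expressing post-launch results as change against them. The metric names and figures in this sketch are illustrative assumptions only.

```python
# Minimal sketch: express post-launch results as change against a recorded
# pre-implementation baseline. All metrics and values are hypothetical.
baseline = {"self_service_share": 0.18, "omni_channel_drop_rate": 0.22, "nps": 31}
current  = {"self_service_share": 0.27, "omni_channel_drop_rate": 0.15, "nps": 38}

for metric, before in baseline.items():
    after = current[metric]
    change = after - before
    pct = change / before if before else float("nan")
    print(f"{metric}: {before} -> {after} ({change:+.2f}, {pct:+.0%} vs. baseline)")
```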
