Despite what many insurance companies may think and desire, they cannot develop all of their systems and solutions in-house. In-house development is fraught with pitfalls, from overrunning budget and time constraints to bogging down an already overworked IT staff, and many carriers eventually decide to look elsewhere to meet their ongoing technology needs. Because of this, it’s a given that both the carrier and the technology vendor actively seek the best possible working relationship.
While maximizing the partnership is always important, one aspect of the standard vendor/carrier relationship that can be especially problematic, according to both sides, is the beta testing process.
“There are two kinds of beta tests,” says Matt Josefowicz, director of New York-based research firm Novarica, “one that’s part of product development for the vendor, where the solution is not yet for sale, and one that’s almost a proof of concept, or an internal beta test as part of a sale and implementation process.”
In either case, beta testing is integral to the successful deployment of any new solution or system, and it can be a white-knuckle process, with harried analysts, project managers, and business and IT personnel on both sides working feverishly to find every possible bug, problem and concern within a carefully specified timeframe. Carrier and vendor alike know the importance of the testing: in many cases, the technology being tested is a core system replacement that, if not tested properly, could make or break not only the relationship, but also the companies themselves.
So how do carriers and vendors make this relationship work? While it may seem obvious, the key to it all is communication.
“Communication is paramount,” says Bill Zimmerman, manager of information systems at Illinois Mutual, Peoria, Ill., which is one of a select few beta partners of Perceptive Software Inc., testing the Shawnee, Kan.-based vendor’s ImageNow document imaging software. “That was the key to our success in this project.”
Perceptive Software has a beta program available in which a small number of organizations per market segment are invited to participate. Vital to that process, says Zimmerman, is constant communication and constant feedback that includes both system and user documentation.
The vendor agrees. “It all comes down to good communication,” says Sascha Ohler, senior product manager, Perceptive Software. “In any beta scenario, you always rely on the back-and-forth between the vendor and the customer, so you want to make sure the way you communicate, and the methods being used for communication, are well thought out long before the beta program.”
Another insurer, Penn National Insurance, a property/casualty insurance company in Harrisburg, Pa., recently purchased a ratings management program from Frisco, Texas-based Skywire Software that it eventually plans to deploy as its universal ratings engine for both its personal and commercial lines.
“The most important thing is constant communication,” says Dean Kimball, Penn National’s project manager. “We’ve found that if we’re not talking to them multiple times per week about workplans, issues and deliverables, we can get off track.”
Kimball says that during the unit test phase with Skywire, he was in contact with them almost daily. Skywire’s business analyst contacted him right away if there were any issues with the test cases. The two sides also participated in weekly project status meetings, and Kimball had weekly discussions with the corresponding project manager on the Skywire professional services team.
“It’s pretty typical in working with vendors that we generally have at least a weekly conference call with an account exec or project manager,” says Helena Vendrzyk Gordon, director of projects and planning for Penn National. “The people within the project team will be in daily contact and, in many cases, we’ve had people from the vendor located here for a period of time.
“With Skywire, we didn’t do this because it’s not typical of their methodology,” Gordon continues, “but the method we’ve used with them—Skywire having our test cases and validating them prior to hand-off back to us—has worked very well. They came on-site for a few days of meetings, but most of the work has been done off-premises. I think this has worked very well, and has helped reduce expenses.”
Main Street America Group, Jacksonville, Fla., recently completed its first major core system replacement in about 25 years, adding a claim processing system from San Mateo, Calif.-based Guidewire Software Inc., and is now in the early stages of beta testing the vendor’s new ClaimCenter 5.0 offering. Bill Garvey, the company’s IT director, believes communication is essential to making the testing phase work its best.
During the initial implementation, Garvey wanted someone at Guidewire to walk him through the process—to make him feel as though they were just as invested in the project as Main Street America—and he says he got that from his vendor.
“I felt like I could have called anyone at the company and they would have come running to the rescue,” Garvey says. “I hope they maintain that attitude, but that’s what I expected and that’s what I got. I’ve been involved in other projects where I didn’t get that support—where they say they want you to be successful, but you feel like you’re on your own ... I know it sounds like happy, fuzzy relationship type stuff, but that’s what it is, and that’s what’s important.”
CHALLENGES WITH FEEDBACK
Even with the best communication between carriers and vendors, there are still plenty of challenges that can arise during beta testing.
One such challenge in every testing environment is the feedback gathering process and, concurrently, structuring those lines of communication so there isn’t a stratification of feedback in which seemingly lesser comments may be disregarded or buried in the queue.
“I think the biggest challenge is to structure the beta test properly to make sure there are clear mechanisms to gather user feedback and to prioritize the identified issues in terms of phasing any enhancements or modifications that need to be made based on them,” Josefowicz says. “You have to be careful that you don’t just respond to the users who complain most loudly, but make sure each user-identified issue is prioritized and evaluated strategically rather than going into a reactive mode and simply fixing everything users complain about.”
Allotting enough time for testing is another major concern. Carriers need to ensure they set aside enough time for all the different phases of testing.
According to Penn National’s Kimball, this includes putting together a test matrix outlining what needs to be tested, and allocating enough time to match everything in the test matrix, which includes the expected results for each test case.
It’s also important to factor in time for the coding updates and other fixes required by problems found during testing.
Bill Garvey agrees. “Once upon a time, we had a disdain for the Q/A process in regard to time,” he says. “But now, I’m a firm believer that the Q/A process is more important than any other part in the lifecycle of a project. The time that it takes to do these things should not be dismissed. You should take your time to do it right.”
Going along with this is the creation of test cases for use during the beta process, which also takes a great deal of time. Kimball managed three different testing environments where, after successfully running through the first handful of test cases, he migrated them to his systems test environment, where a business support team went through a larger number of test cases focused on different rating combinations.
“One challenge,” Kimball says, “is the amount of detail and work that needs to go into the test cases, especially building the expected results, which act as a guide for the business support team that does the testing to know whether each test case is successful or not.”
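The test matrix Kimball describes pairs each set of rating inputs with an expected result so testers can see at a glance whether a case passed. As a rough illustration only — the `rate()` function, its factors, and the case fields below are hypothetical stand-ins, not Penn National’s or Skywire’s actual engine — such a matrix might look like this:

```python
# Minimal sketch of a rating test matrix with expected results.
# rate() is a toy stand-in for the rating engine under test; the
# specific lines, territories and factors are invented for illustration.

def rate(line, territory, limit):
    """Toy rating function standing in for the engine under test."""
    base = {"personal": 100.0, "commercial": 250.0}[line]
    territory_factor = {1: 1.00, 2: 1.15, 3: 1.30}[territory]
    return round(base * territory_factor * (limit / 100_000), 2)

# Each test case records its inputs and the expected result, so the
# business support team running the cases can judge pass/fail directly.
test_matrix = [
    {"line": "personal",   "territory": 1, "limit": 100_000, "expected": 100.00},
    {"line": "personal",   "territory": 3, "limit": 100_000, "expected": 130.00},
    {"line": "commercial", "territory": 2, "limit": 200_000, "expected": 575.00},
]

def run_matrix(matrix):
    """Run every case and record actual vs. expected."""
    results = []
    for case in matrix:
        actual = rate(case["line"], case["territory"], case["limit"])
        results.append({**case, "actual": actual,
                        "passed": actual == case["expected"]})
    return results

for r in run_matrix(test_matrix):
    status = "PASS" if r["passed"] else "FAIL"
    print(f'{status}: {r["line"]} T{r["territory"]} limit {r["limit"]} '
          f'-> {r["actual"]} (expected {r["expected"]})')
```

The point of the structure is the one Kimball makes: building the expected results up front is most of the work, but it turns each test run into a mechanical comparison rather than a judgment call.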
Another issue most carriers identified is the inability to test absolutely everything during the beta phase. While both carriers and vendors would like to test every possible scenario, given the weeks-to-months timeframe of the typical beta process, insurers need to test as much as they can while focusing on what’s most important to their business.
“Take the time to think about where your risk points are and craft a testing strategy that addresses those risk points,” says Gordon. “With a complex system, you’re never going to be able to test everything, so you want to ensure you test where your risk really is and craft a test matrix and a test case that addresses those risk points without being onerous.”
Even with the right test strategy, things can still fail to turn out as expected once the product is implemented.
“You can’t possibly test everything,” Garvey says. “One of the things we learned right away after going live was what we didn’t test was the exuberance of the claims adjusters in attaching documents to the files. They used our claims system as a document management system, and we found it took a hit on performance a few months out of the chute. Eventually we did some re-architecting of the infrastructure, but we never even thought about this — we didn’t think to test for an overload of documents, but we will the next time.”
In the end, according to Illinois Mutual’s Zimmerman, companies on both sides of the fence must ensure the proper resources are available during the testing phase, along with a willingness to actively participate. For carriers, he thinks that beta testing is a process that needs to be closely managed and, sometimes, managed differently from day to day to work well.
Another piece of advice, from Guidewire VP and co-founder James Kwak: testing is not something to think about only at the end. Both carriers and vendors should be thinking about testing throughout the beta process and, ideally, testing as much as possible as they go along.
“It’s not just enough to spend lots of man-hours at the end of the day trying to find problems in software,” he says. “You have to think all along the way as you design and implement it about how to make it testable and how you’re going to test it.”
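One common way to follow Kwak’s advice about designing for testability — offered here as a generic sketch, not Guidewire’s approach — is to pass a component’s dependencies in rather than hard-coding them, so a test can substitute controlled stand-ins. The `ClaimIntake` class and its clock injection below are hypothetical:

```python
# Hedged sketch of "design for testability" via dependency injection.
# ClaimIntake is an invented example class, not any vendor's API.

import datetime

class ClaimIntake:
    def __init__(self, now=datetime.datetime.now):
        # Injecting the clock lets tests pin "now" to a fixed value
        # instead of depending on the real wall-clock time.
        self._now = now

    def open_claim(self, claim_id):
        """Record a new claim stamped with the (injectable) current time."""
        return {"id": claim_id, "opened": self._now().isoformat()}

# In production, ClaimIntake() uses the real clock.
# In a test, substitute a frozen clock for deterministic output:
fixed_clock = lambda: datetime.datetime(2008, 1, 1, 12, 0, 0)
intake = ClaimIntake(now=fixed_clock)
print(intake.open_claim("C-123"))
```

Because the clock is a constructor argument, the test can assert an exact timestamp — the kind of “how are you going to test it” decision Kwak argues should be made during design, not after.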
Find more about maintaining vendor relationships by searching “Forging Lasting Alliances” at www.insurancenetworking.com.
(c) 2008 Insurance Networking News and SourceMedia, Inc. All Rights Reserved.