When Data Integration Isn't Enough

For many carriers, data integration has been a holy grail for years, and there are many successful outcomes to show for all the work that's gone into it. But it can also be clunky and time-consuming. Now, emerging strategies such as data virtualization and cloud- and service-oriented architectures promise to take things to a whole new level.

That's the gist of Judith Davis and Robert Eve's latest book, "Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility." While integration has been about cobbling production systems together to deliver point-to-point solutions, “data virtualization” is a way to pull data from various sources into an abstracted service layer, removed from the live databases. This helps cut down the need for physical storage, and provides a common interface for all applications using the data, especially BI, analytics and transaction systems.
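To make the idea concrete, here is a minimal sketch (my own illustration, not from the book) of what an abstracted service layer looks like in code: two hypothetical source systems, `PolicySystem` and `ClaimsSystem`, sit behind a single `VirtualView` that joins their records on demand, so consumers query one interface instead of each live database. All names and data here are invented for illustration.

```python
class PolicySystem:
    """Stand-in for a live policy administration database."""
    def records(self):
        return [{"policy_id": "P1", "holder": "Acme Co", "premium": 1200}]


class ClaimsSystem:
    """Stand-in for a separate claims database."""
    def records(self):
        return [{"policy_id": "P1", "claim_id": "C9", "amount": 300}]


class VirtualView:
    """Abstraction layer: joins the sources on demand, so consuming
    applications see one schema and no data is physically copied."""
    def __init__(self, policies, claims):
        self.policies = policies
        self.claims = claims

    def policy_with_claims(self, policy_id):
        # Federate the two sources at query time.
        policy = next(p for p in self.policies.records()
                      if p["policy_id"] == policy_id)
        claims = [c for c in self.claims.records()
                  if c["policy_id"] == policy_id]
        return {**policy, "claims": claims}


view = VirtualView(PolicySystem(), ClaimsSystem())
print(view.policy_with_claims("P1")["holder"])  # Acme Co
```

Real data virtualization platforms do far more (caching, security, query optimization), but the design choice is the same: the consuming application binds to the virtual schema, not to the underlying stores.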

Davis and Eve relate a number of case studies, including the experience of Northern Trust, a leading provider of outsourced financial services. The problem Northern Trust was running into was that its Investment Operations Outsourcing unit could not on-board corporate customers fast enough for its outsourced investment management operations. As a finance-savvy company, it understood exactly what that meant: the delays in getting its services up to speed translated directly into lost revenue.

One of the biggest obstacles, Davis and Eve relate, was that it took too much time for Northern Trust to set up client-side reporting functions for customers so that it could provide data, performance and valuation reports. Such information was stored across multiple, separate databases, including mainframe systems. As senior vice president Leonard Hardy is quoted in the book: “The old way of reporting required technology intervention for every single customer implementation... each institution has its own formats, labels, graphics and icons.” You can imagine all the legwork involved when a new client came on board.

To address this, Hardy's team put a virtualized data warehouse platform in place, combined with a new client reporting front end. Data from all relevant sources is now abstracted above the outsourcing unit's data stores; the platform, in the authors' words, “creates a unified virtual view that makes them appear as a single physical data store.”

The bottom line, Davis and Eve report, is that Northern Trust was able to transfer time-consuming report customization work that previously had to be done for each new institutional customer—by a very high-priced and very busy IT department—to the company's operations partners, which include business analysts. The company's outsourcing unit simply didn't have enough programmers and IT staff to go around.

“We have been successful in our goal of taking the technology out of the outsourcing equation,” Hardy is quoted as saying. The solution helped the company achieve a 50-percent reduction in time to market and a 200-percent ROI.

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.

Readers are encouraged to respond to Joe using the “Add Your Comments” box below. He can also be reached at joe@mckendrickresearch.com.

This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.
