Data virtualization is widely seen as the answer to many of the woes that have afflicted data management over the years: the expense of data warehousing, information siloed across departments and systems, and the challenge of reconciling multiple formats.
To a large extent, the ability to abstract key data sources within a highly accessible service layer will pave the way to solving many of these issues. However, data virtualization isn’t a simple technology fix that happens overnight, or when a single vendor's solution is dropped into the data center. Rather, it is part of a strategy that the business and IT departments need to plan out together.
In their latest book, Davis and Eve offer several recommendations for getting there:
Centralize responsibility for implementing data virtualization. Key to this is to implement a common data model to ensure consistent, high-quality data, which also makes business users “more confident in the data and IT staff more agile and productive.”
Educate the business on the advantages of virtualization. “Allocate time to consult with business users and make sure they understand the data,” Davis and Eve recommend — and be prepared to provide support.
Pay attention to performance tuning and scalability. This is a critical success factor for any and all virtualization efforts, Davis and Eve said. Also, they say, “accommodate the fact that users are unpredictable on ad-hoc analysis and reporting.”
Take a phased approach to implementing data virtualization. Don't try to boil the ocean all at once. Start with one department and build out from there.
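The “common data model” idea in the first recommendation can be made concrete with a small sketch: a virtual view that queries two siloed systems on demand and maps each source's fields into one shared schema, without copying data into a warehouse. All names here (the two source systems, their field names, the mappings) are hypothetical illustrations, not anything prescribed by Davis and Eve.

```python
# Minimal sketch of a common data model over two siloed sources.
# Source names, schemas, and mappings below are hypothetical.

# Source A: a policy administration system with its own schema
policy_sys = [
    {"pol_no": "P-100", "holder": "Acme Co", "premium_usd": 1200.0},
]

# Source B: a claims system that describes the same entities differently
claims_sys = [
    {"policyNumber": "P-100", "insured_name": "Acme Co", "claim_amt": 350.0},
]

def to_common(record, mapping):
    """Translate one source record into the common data model."""
    return {common: record[src] for common, src in mapping.items()}

# Per-source field mappings into the shared schema
POLICY_MAP = {"policy_id": "pol_no", "customer": "holder"}
CLAIMS_MAP = {"policy_id": "policyNumber", "customer": "insured_name"}

def virtual_policies():
    """A 'virtual view': federate both silos at query time; no data is copied."""
    seen = {}
    for rec in policy_sys:
        row = to_common(rec, POLICY_MAP)
        seen[row["policy_id"]] = row
    for rec in claims_sys:
        row = to_common(rec, CLAIMS_MAP)
        seen.setdefault(row["policy_id"], row)  # keep first sighting per policy
    return list(seen.values())

print(virtual_policies())
```

Because every consumer sees only the common fields (`policy_id`, `customer`), business users get one consistent answer regardless of which silo holds the record — the consistency and confidence benefit the authors describe.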
Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.
This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.