Launching a new coverage line? Moving accounts off the mainframe? Taking over a smaller company?
Let's not mince words: data integration is hard work. There are a lot of data sources that need to be vetted, cleansed, transformed and brought into the fold.
Perhaps there is a way to establish a system flexible enough to absorb any and all new data that comes its way.
What's required is an adaptable architecture — something we call a next-generation data integration architecture — which can grow and change, while enabling “one-click integration” as new data sources are brought in.
John Schmidt, who has co-authored a book on the subject, describes what such an architecture looks like in practice.
For example, data integration becomes a business process, not an IT activity. This is an important step toward extracting greater value from integration efforts. IT still plays a key role as an enabler of the process, however. The key is that data integration is baked into the business and largely automated, rather than being a special, one-off effort that must be started from scratch every time a new data set is brought into the business. It becomes a repeatable process that occurs almost without prompting.
John borrows a term from management guru Jim Collins, describing the elevation of data integration to an enterprise process as a “Big Hairy Audacious Vision.”
Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.
This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.