In a recent post, I talked about the challenges “Big Data” poses to insurance companies. Now, I'm going to talk about one of the most promising strategies for managing these terabytes' worth of data.
That strategy is Master Data Management (MDM), in which organizations maintain and reference a single "gold copy" of data rather than attempting to aggregate information from across silos, divisions and networks.
MDM, far from being an academic exercise for data managers and DBAs, has a direct and substantial impact on the competitiveness of the business. In preparing an article on the subject for the latest edition of INN (click here for the story), I heard from Joshua Schwartz, director at PwC's Diamond Advisory Services, who explained that MDM is the best recourse for unlocking siloed data, which is often spread across multiple applications for each line of business:
“For many insurance processes, especially first notice of loss, every second counts,” he said. “Master data can really help in searching and retrieving existing customer records, or pulling in associated claimants to a party among people, companies, vendors or third parties, which we already know saves time up front. This accelerates the processes for adjudicating and processing claims. The time spent by your claims handler or servicing agent to continually repeat and input data not only creates data quality problems downstream, but takes time when servicing customers.”
Schwartz provides three key elements that should be part of an MDM approach:
1. Address organizational issues by identifying an owner for master data assets and for each domain across business functions; and develop a data stewardship program to enforce consistency and reusability for data delivery and use.
2. Create common and shared processes for defining, administering and governing master data by establishing a formal governance process and decision-making guidelines for master data.
3. Employ technology to establish a single version of master data containing uniquely identified records from various business and information domains. Technology should include an integration platform to support the movement of master data between operational systems and across external entities. Technology also needs to enable the establishment of business rules that define the trustworthiness of data sources, and the attributes used to develop a golden view of the data.
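To make the third element concrete, here is a minimal sketch of how such "trustworthiness" rules might drive a golden record. The source-system names, trust scores and attributes below are hypothetical, not drawn from PwC's methodology: each contributing system gets a trust score, and for every attribute of a uniquely identified party the merge keeps the value from the most trusted source that supplies it.

```python
# Hypothetical source systems and trust scores -- an assumption for
# illustration, not an actual insurer's configuration.
SOURCE_TRUST = {"claims_system": 0.9, "policy_admin": 0.7, "agent_portal": 0.4}

def build_golden_record(records):
    """Merge records sharing a party_id into one golden view.

    records: list of dicts such as
      {"source": "claims_system", "party_id": "P-100", "name": ...}
    Returns {party_id: {attribute: value}}, keeping, per attribute,
    the value contributed by the highest-trust source.
    """
    golden = {}
    best_trust = {}  # (party_id, attribute) -> trust of the value kept so far
    for rec in records:
        trust = SOURCE_TRUST.get(rec["source"], 0.0)
        pid = rec["party_id"]
        view = golden.setdefault(pid, {})
        for attr, value in rec.items():
            # Skip bookkeeping fields and empty values.
            if attr in ("source", "party_id") or value in (None, ""):
                continue
            # A more trusted source overwrites a less trusted one.
            if trust > best_trust.get((pid, attr), -1.0):
                view[attr] = value
                best_trust[(pid, attr)] = trust
    return golden

records = [
    {"source": "agent_portal", "party_id": "P-100",
     "name": "J. Smith", "phone": "555-0100"},
    {"source": "claims_system", "party_id": "P-100",
     "name": "John Smith"},
]
print(build_golden_record(records))
# The claims system's name wins on trust; the portal's phone survives
# because no more trusted source supplies one.
```

Real MDM platforms layer survivorship rules, match/merge logic and stewardship workflows on top of this idea, but the core decision, which source to believe for which attribute, is what the sketch shows.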
Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.
Readers are encouraged to respond to Joe using the “Add Your Comments” box below. He can also be reached at firstname.lastname@example.org.
This blog was exclusively written for Insurance Networking News. It may not be reposted or reused without permission from Insurance Networking News.
The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.