Can you put a price on good data? Intuitively, we know it's important to have the cleanest, best-deduplicated, most well-managed data possible. Now researchers have calculated the financial impact of data quality, finding that maintaining good data could translate into a 10% reduction in processing time, which in turn can mean millions, if not billions, saved for organizations.
The study was conducted by researchers from the
The researchers say that taking steps to increase “data usability,” meaning making data more accessible to users and of higher quality, can go a long way toward lowering the subsequent processing time needed at the back end.
The Sybase study looked at industry sectors, and estimates that insurance companies could realize a 105% return on equity (ROE) as a result of a 10% improvement in data quality and mobility. ROE is defined as net income/shareholder equity, and is an important indicator of a business’s ability to grow. “As sales forces become increasingly mobile, it is imperative for competitiveness that they have high quality data and IT systems that enable rich interactions with customers,” the study said.
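The ROE arithmetic is straightforward. A minimal sketch in Python, using hypothetical figures chosen only to illustrate the 105% level the study mentions:

```python
# Return on equity (ROE) = net income / shareholder equity.
# These figures are hypothetical, chosen only for illustration.
net_income = 210_000_000          # hypothetical annual net income, USD
shareholder_equity = 200_000_000  # hypothetical shareholder equity, USD

roe = net_income / shareholder_equity
print(f"ROE: {roe:.0%}")  # prints "ROE: 105%"
```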
Data quality is a critical issue that insurance companies are starting to come to terms with. Jason Tiret, director of modeling and architecture solutions for
“Poor data quality leads to a gross misunderstanding of your data, and misunderstanding your data is a bit like driving blindfolded,” he says. “You may eventually get to where you need to go, but it is going to be very risky and very costly.”
What's the best way to go about achieving data quality? Don't try to tackle your entire data store at once, Tiret advises. “Focus on the high-visibility, business-critical areas first. This will provide immediate value and, more importantly, demonstrate success that will gain momentum and buy-in to focus on other areas. If that is claim information and product codes, focus on that. If that is customer and their contact mechanisms, focus on that.”
Tiret also advocates a “thorough understanding of the data and the structure of the data—technical metadata.” To reach this understanding, a data model or data flow model is essential.
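Technical metadata of the kind Tiret describes can often be read straight out of the database catalog. A minimal sketch using Python's built-in sqlite3 module, with a hypothetical claim table:

```python
import sqlite3

# A hypothetical schema for illustration; real systems would introspect
# their own catalog tables or an information_schema equivalent.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claim (claim_id INTEGER, incident_date TEXT, submission_date TEXT)"
)

# PRAGMA table_info returns one row per column:
# (cid, name, type, notnull, default_value, pk)
metadata = conn.execute("PRAGMA table_info(claim)").fetchall()
for cid, name, col_type, *_ in metadata:
    print(name, col_type)
```

Capturing column names and declared types this way is the raw material for the data model or data flow model the quote calls for.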
“Involving the subject matter experts and business will help gain understanding of the data and its accuracy according to the business,” he says. “It is easy, technically, to see if only dates are stored in a date column. It is much more difficult to understand that a claim submission date must be on or after the incident date.”
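The distinction Tiret draws, between a technical type check and a business rule, can be sketched in a few lines of Python. The record layout and field names here are assumptions for illustration, not part of any real claims system:

```python
from datetime import date

# Hypothetical claim records; field names are assumptions for illustration.
claims = [
    {"incident_date": date(2011, 3, 1), "submission_date": date(2011, 3, 5)},
    {"incident_date": date(2011, 4, 10), "submission_date": date(2011, 4, 2)},
]

# Technical check: every value in the date columns really is a date.
type_errors = [
    c for c in claims
    if not all(isinstance(c[k], date) for k in ("incident_date", "submission_date"))
]

# Business-rule check: a claim must be submitted on or after the incident date.
rule_errors = [c for c in claims if c["submission_date"] < c["incident_date"]]

print(len(type_errors), len(rule_errors))  # prints "0 1"
```

Both records pass the technical check, but the second violates the business rule, which is exactly the kind of error that only surfaces once subject matter experts have spelled the rule out.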
Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology.