In conversations with insurers globally, we at Celent are hearing of a new approach to analytics. It isn't called big data, but it is a different approach: one that seeks to leverage data far more quickly and to be more tolerant of errors in that data. A shift toward the idea that all data is useful is occurring, but only in baby steps so far; I still often hear discussions framed around truth, fact, and consistent data.

When thinking about data, this idea of truth has always bothered me: the notion that system data represents the facts, or the unassailable truth. One of the key activities in establishing classic analytics processes is deciding which data is the truth, and there are always arguments about which data is accurate and can be trusted. In this process, inaccurate data (or data that doesn't contribute to this truth) is ignored, removed, or lost. The result is a negotiation whose output is often called the single version of the truth: typically a report that all stakeholders agree to. The strange thing about this process is that it acknowledges multiple viewpoints, yet seeks a single truth regardless. Relational database design and modern user interfaces push us toward this line of thinking; after all, there is only one field to fill in, one answer to each question.
