A Lesson from the Financial Crisis: We Need Better Data Governance

Last week, the U.S. House Financial Services Committee’s Subcommittee on Oversight and Investigations took up the issue of how technology can be employed to improve TARP and financial services oversight. Technology can always be improved upon, of course. But an additional element is needed to better manage things that get out of control—greater trust in the information our systems are generating.

Testimony by Dilip Krishna, a specialist in risk and financial management for Teradata’s financial services and insurance organization, summed up the challenge facing an industry built on trust: trust breaks down in the absence of transparency. We have plenty of technology for slicing, dicing and parsing data, but that’s not going to prevent another meltdown such as we saw at the end of 2008.

Obviously, we can’t have complete, open transparency to all outsiders; we need to maintain confidentiality of customer records. But the trust factor is key.

“Trust lies at the heart of transparency,” Krishna said. “It is only in unusual circumstances, or at very high cost, that financial information can be demonstrated to be completely authentic.”

Being able to trust the data or information on which we base decision-making is a vital part of this formula. Krishna called this process the “information assembly line,” in which “data needs to be complete and detailed while it is transformed into useful information as it moves from the transaction systems to the point of disclosure. Confidence in the reported information can only be gained when there is confidence in the robustness of the assembly line—for example, via knowledge that all changes during the process of creating the information are fully audited and controlled.”
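To make the idea concrete, here is a minimal sketch of what one audited step in such an "information assembly line" might look like. This is purely illustrative and not drawn from Krishna's testimony or any Teradata product; the function names, the sample records and the hashing scheme are assumptions chosen to show how a change to the data can be captured in a tamper-evident audit trail.

```python
import hashlib
import json
from datetime import datetime, timezone


def fingerprint(records):
    """Return a stable hash of a list of records, used as audit evidence."""
    payload = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


def audited_step(name, records, transform, audit_log):
    """Apply one transformation and log before/after fingerprints and counts."""
    before = fingerprint(records)
    result = transform(records)
    audit_log.append({
        "step": name,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input_hash": before,
        "output_hash": fingerprint(result),
        "records_in": len(records),
        "records_out": len(result),
    })
    return result


# Hypothetical raw transactions moving toward a disclosure-ready form,
# with every change recorded in the audit log along the way.
transactions = [
    {"id": 1, "amount": 125.00, "currency": "usd"},
    {"id": 2, "amount": -40.50, "currency": "USD"},
]
audit_log = []

cleaned = audited_step(
    "normalize_currency",
    transactions,
    lambda recs: [{**r, "currency": r["currency"].upper()} for r in recs],
    audit_log,
)
print(json.dumps(audit_log, indent=2))
```

A reviewer who trusts the log can verify that the reported figures trace back, step by step, to the original transaction data, which is the kind of confidence in the "robustness of the assembly line" Krishna describes.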

How does an organization go about ensuring that its “information assembly line” is delivering the most credible and timely information possible? The answer is data governance.

In a new book, The Data Asset: How Smart Companies Govern Their Data for Business Success, Tony Fisher, president and CEO of DataFlux, a part of SAS, writes that “data governance and data quality should never be considered a one-time project. A quality culture must be established as an ongoing, continuous process.” A strong, sustainable data governance program assures transparency, he adds. And at this time, too few organizations approach data governance this way.

Achieving a proactive state of data governance takes time and needs to be approached in incremental stages, Fisher says. He recommends establishing a data governance center of excellence. Additionally, “business analysts, working through data stewards, start to control the data management process, with IT playing a supporting role.”

Organizations striving for effective data governance, he says, “need to think less about functions, less about data, and more about processes.” The mark of a proactive organization is that it makes “information management decisions based on the need to improve the business rather than the need to improve the IT infrastructure.”

As financial organizations continue to repair themselves in the wake of the financial crisis, perhaps it’s time to establish stronger governance over the creation and management of the data that now drives these businesses.

Joe McKendrick is an author, consultant, blogger and frequent INN contributor specializing in information technology. He can be reached at joe@mckendrickresearch.com.

The opinions of bloggers on www.insurancenetworking.com do not necessarily reflect those of Insurance Networking News.
