Follow the Data When Tracking Systemic Risk

The details of an Obama Administration financial reform bill are still being ironed out. But Wall Street operations executives are taking notice of one core element: a proposal to create a systemic risk regulator.

"Such an initiative will mean that firms need a greater transparency into their decision-making process and an enterprisewide view of their risk," one operations executive at a Wall Street bulge bracket firm told Securities Industry News last week. "Firms will need to understand the decision-making process and have a solid audit trail."

Just who is to blame for the financial crisis is still unclear. Quants, risk managers, chief executives and even regulators are all taking heat. But one thing is clear: firms didn't or couldn't pull together information that would allow their own executives to analyze risks across their various units, or allow regulators to see a holistic view of what was going on.

Data—and the risk calculations they allow—were managed at the business-line level. So boards of directors at many firms had an incomplete and misleading view of the aggregate market, credit, counterparty, liquidity and collateral risks they faced, among others.

"It's not simply about complying with legislation. Firms haven't solved the data issue yet, which is what hindered transparency in the past," says Michael Atkin, managing director of the Enterprise Data Management Council, a New York trade group specializing in promoting effective data management. "Firms will need to collect data across different parts of the business and ensure it's not only accurate, but consistent as well."

The EDM Council has teamed up with roughly 60 Wall Street executives, academics, financial analysts and former regulators to promote the creation of a National Institute of Finance. The institute would collect an array of data from the back offices of major institutions, as well as from corporate and municipal issuers, on a daily basis. The NIF would then perform "what-if" analyses to determine how the failure of one financial institution could affect the remainder of the financial system. Such an assessment would move up to the so-called systemic risk regulator.
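To get a sense of what such a "what-if" analysis involves, consider a minimal sketch in which the institutions, their capital and their bilateral exposures are all invented for illustration: a failure cascades to any creditor whose losses exceed its capital. This is a toy model, not the NIF's proposed methodology.

```python
# Illustrative only: a toy "what-if" contagion analysis over a
# counterparty-exposure graph. Names, exposures and capital levels
# are hypothetical, not NIF methodology.

# Directed exposures: creditor -> {debtor: amount the debtor owes}
EXPOSURES = {
    "Bank A": {"Bank B": 40, "Dealer C": 25},
    "Bank B": {"Dealer C": 60},
    "Dealer C": {"Bank A": 10},
}
CAPITAL = {"Bank A": 50, "Bank B": 45, "Dealer C": 30}


def what_if_failure(seed: str) -> set[str]:
    """Return the institutions that fail if `seed` defaults.

    A failed institution is assumed to repay nothing; any creditor
    whose accumulated losses reach its capital fails in turn.
    """
    failed = {seed}
    losses = {name: 0.0 for name in CAPITAL}
    frontier = [seed]
    while frontier:
        defaulter = frontier.pop()
        for creditor, owed in EXPOSURES.items():
            if creditor in failed:
                continue
            loss = owed.get(defaulter, 0.0)
            if loss:
                losses[creditor] += loss
                if losses[creditor] >= CAPITAL[creditor]:
                    failed.add(creditor)
                    frontier.append(creditor)
    return failed


print(what_if_failure("Dealer C"))  # here the whole toy system fails
```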

So just what data is necessary, and how can a firm keep track of it? Transactional and position data, for starters: the types of securities a firm has traded, in what quantities, with whom and when. But this is just the tip of the iceberg. There is also descriptive and pricing information on financial instruments, and information on counterparties: trading partners and customers, who owns them, and who is responsible for paying off their obligations in the event they go bust.
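One way to picture these categories is as a simple schema. The sketch below, with field names that are purely illustrative rather than any standard model, shows how transactional, position, instrument reference and counterparty data hang together.

```python
# Illustrative data model for the categories described above; all
# field names are hypothetical, not an industry-standard schema.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class Instrument:                 # descriptive and pricing reference data
    identifier: str               # e.g. an ISIN or CUSIP
    asset_class: str
    last_price: float


@dataclass
class Counterparty:               # trading partners and customers
    name: str
    parent: Optional[str]         # who owns them
    guarantor: Optional[str]      # who pays their obligations if they fail


@dataclass
class Transaction:                # what was traded, how much, with whom, when
    instrument: Instrument
    counterparty: Counterparty
    quantity: float
    price: float
    executed_at: datetime


@dataclass
class Position:                   # current holdings aggregated from trades
    instrument: Instrument
    net_quantity: float
```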

Middle- and back-office systems will have to keep track of much of the position and reference data. Front-end automated trading platforms and other compliance packages, for instance, will go a long way toward providing easy access to transactional data.

Portware, a New York-based global multi-asset electronic trading platform, for one, says that its smart order-routing system provides a snapshot of all the market data from multiple liquidity pools needed before a trading decision is made and retains the data for auditing purposes.

In May, HedgeOp Compliance, a provider of compliance software and consulting services for asset managers, released its Compliance Trak Adviser Platform, a tool designed to help small- and mid-sized registered investment advisers comply with regulatory requirements. The software also keeps track of who bought what securities when, and whether a user violated any trading guidelines.
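In generic terms, and not as a description of HedgeOp's actual product, that kind of tracking can be pictured as a trade log checked against per-user restrictions; every name and limit below is hypothetical.

```python
# Generic sketch of trade-guideline checking: each trade is recorded
# (who bought what, when) and tested against simple restrictions.
# Restricted lists and limits here are invented for illustration.
from datetime import datetime

RESTRICTED = {"analyst_1": {"XYZ"}}   # hypothetical per-user restricted lists
MAX_ORDER_QTY = 10_000                # hypothetical firm-wide size limit

trade_log, violations = [], []


def record_trade(user: str, symbol: str, qty: int) -> None:
    trade = {"user": user, "symbol": symbol, "qty": qty,
             "time": datetime.utcnow()}
    trade_log.append(trade)           # audit trail: who bought what, when
    if symbol in RESTRICTED.get(user, set()):
        violations.append((trade, "restricted security"))
    if qty > MAX_ORDER_QTY:
        violations.append((trade, "order size limit exceeded"))


record_trade("analyst_1", "XYZ", 500)
print(violations)                     # [({...}, 'restricted security')]
```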

Will these kinds of software packages be enough to satisfy the need for regulators to look underneath a firm's kimono and understand just what is going on?

Brains will also be needed. Deciding who is responsible for which data a firm keeps will be critical, according to Atkin. In fact, some firms may eventually decide to designate a chief data officer to manage a team of data specialists, such as data modelers, data stewards and taxonomists. The specialists, in turn, will figure out what data is stored where, verify its accuracy, build data models, and create and attach metadata. The data officer and the team would decide who may access what data, integrate databases, and maintain archived data to meet regulatory requirements.
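The bookkeeping such a team maintains might look something like a data catalog: each dataset mapped to its location, its steward, its metadata and its permitted readers. The sketch below is a hypothetical illustration, not any particular vendor's system.

```python
# Hypothetical sketch of a data team's catalog: where each dataset
# lives, who stewards it, its metadata, and who may read it.
CATALOG = {
    "positions": {
        "store": "warehouse.positions",       # where the data lives
        "steward": "ops_data_team",           # who verifies its accuracy
        "metadata": {"frequency": "daily", "source": "back_office"},
        "readers": {"risk", "compliance"},    # who may access it
    },
}


def can_read(role: str, dataset: str) -> bool:
    entry = CATALOG.get(dataset)
    return bool(entry) and role in entry["readers"]


assert can_read("risk", "positions")
assert not can_read("marketing", "positions")
```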

Most firms so far are maintaining the status quo: keeping different pieces of data in many different databases.

Often, according to Atkin and other data experts, this can amount to as many as 30 different, somewhat redundant databases with discrepancies and errors, as well as 30 or more applications that consume reference data and incoming data feeds from over a dozen data vendors.

Such a hairball of data collection just won't cut it anymore.

"How a firm reacts to the need for accurate, timely, integrated data is now a critical differentiator in a competitive environment," warns Dayle Scher, a senior analyst with TowerGroup, a Needham, Mass-based research shop. "Financial firms must be prepared to respond to increased due diligence on the part of investors and certainly more restrictive regulations looming."

After ensuring the data is accurate, says Scher, firms will need to share it, in the right formats and in a timely manner, with the downstream applications, such as order management and portfolio accounting software, and the staff that need it.

Neither of the two alternatives now in common use, a transaction-based extract, transform and load (ETL) operation or a storage-based data warehouse, appears to be an ideal solution.

"The transaction-based ETL approach makes it easier to add new data sources and feeds than a storage-based solution," explains Scher. "However, the ETL approach doesn't support the storage of historical data and cannot support complex reporting requirements."

Likewise, according to Scher, a storage-based data warehouse does collect the historical and current data needed for complex reporting. But adding new data feeds to it is difficult, and it requires huge processing resources.

The bottom line, according to Scher, is that a firm needs to use transactional data to support its day-to-day operations while also analyzing historical data, without reducing the performance of the systems involved.

"Compliance departments need to be able to assess instantly the adherence to client guidelines at the same time the chief operating officer analyzes the impact of bad data on a firm's back-office operations," says Scher. "Data modeling activity that also supports trading decisions must coexist with customer data that supports client services."

A "third solution" could be akin to that developed by Asset Control, a New York and London-based data management software vendor. The firm's AC Invest product allows firms to create a single or multiple "golden copies" of data by allowing the separate business units to manage and refine the rules governing normalization of the data.

"Business users can easily see what data is held on each asset or financial instrument and where it comes from through their standard internet browser," says Jean Williams, product business owner for Asset Control. "The data model allows business users to easily create the reports they need quickly using spreadsheets and off-the shelf reporting tools. They can also see what data is held on each asset or financial instrument and where it comes from."

However, business users, employees within different departments, will be restricted in just what types of data they can alter. Whenever the data is accessed, the system retains a full audit trail of just who or what process changed any aspect of it and when.
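In the abstract, and without reference to Asset Control's actual interfaces, the golden-copy pattern can be sketched as follows: each business unit owns the normalization rule for its own source, and every change lands in an audit trail recording who changed what and when. All rule and field names below are hypothetical.

```python
# Generic sketch of the "golden copy" pattern, not Asset Control's
# actual AC Invest API: each business unit supplies the normalization
# rule for its own source, and every change is written to an audit trail.
from datetime import datetime

golden_copy: dict[str, dict] = {}
audit_trail: list[tuple] = []

# Hypothetical per-unit rules mapping a raw source record to common fields.
RULES = {
    "equities_desk": lambda r: {"price": r["last_px"], "currency": r["ccy"]},
    "fixed_income": lambda r: {"price": r["clean_price"], "currency": "USD"},
}


def update(asset: str, unit: str, raw: dict, user: str) -> None:
    record = RULES[unit](raw)            # unit-owned normalization rule
    record["source"] = unit              # users can see where data came from
    golden_copy[asset] = record
    audit_trail.append((datetime.utcnow(), user, asset, record))


update("IBM", "equities_desk", {"last_px": 185.2, "ccy": "USD"}, "jsmith")
print(golden_copy["IBM"])  # {'price': 185.2, 'currency': 'USD', 'source': ...}
```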

This story has been reprinted with permission from Securities Industry News.
