CDOs Must Build Bridges, Not Silos

"Market data and reference data functions are, if not unifying into a single unit, converging under a common structure."

Max Bowie, editor, Inside Market Data

The separation of market and reference data functions that took place over a decade ago was not without good cause: until then, reference data was largely managed within a firm’s market data function, and didn’t get the recognition and support from senior management that it deserved. If memory serves me correctly, poor reference data was the number one cause of failed trades. That cost has since been eliminated to a large degree, as vendors like Markit created products such as the Reference Entity Database, and as firms got their houses in order, creating internal “golden copies” of securities master data and, critically, translating those failed trades into a profit-and-loss (P&L) argument that alarmed senior managers enough to set aside separate budgets for reference data projects, processes and staff.

However, the sheer volumes and complexity of data now being captured, processed and monitored by financial firms make data a much bigger challenge than in previous years: it accounts for a larger share of budgets and carries inherently higher levels of risk. To address the reality that trading firms increasingly have more in common with data processing firms, banks and asset managers alike are appointing chief data officers (CDOs) to oversee all aspects of a firm’s data management, from market data to reference data, and from data held in internal documents to confidential client data. To perform these roles successfully, CDOs must work hand-in-hand with various other departments, from operations to trading functions. And, of course, they have direct oversight of the most data-intensive areas of all: their market data and reference data departments.

As a result, market data and reference data functions are, if not unifying into a single unit, converging under a common structure. This became more evident than ever at the recent European Financial Information Summit in London, where market data professionals were as concerned with provenance as with prices, and with legal entity identifiers (LEIs) as much as with latency, and where reference data experts were as concerned about real-time changes to information as they were about traditional static data.

This convergence also exists beyond the world of end-user firms: for example, enterprise data management (EDM) software vendor GoldenSource has fully integrated its market data management module with its core suite of EDM capabilities. According to the vendor, this will not only help centralize overall data management, but will also make it easier to add coverage of new datasets and to manage a firm’s response centrally to regulatory requirements that demand access to both market data and reference data. One such requirement is the Fundamental Review of the Trading Book (FRTB) proposals from the Bank for International Settlements’ Basel Committee on Banking Supervision, which take effect in 2019 and which GoldenSource managing director of sales and client operations Neill Vanlint says “will change forever the way that risk and finance manage data.”

To be sure, another reason for this change is the increasingly strict and burdensome regulatory environment. This is not to say that firms believe bringing these groups closer will directly save money, but rather that by creating closer ties between all data assets, and between the people, systems and groups that govern them, firms will be better placed to obtain a single, more accurate view of their data, and therefore to respond quickly and accurately to regulators’ reporting demands, minimizing both fines and the cost of providing this function.

It seems that as the datasets themselves diverge, with new types of data evolving and others being separated from one another for practical and budgetary purposes, it will become even more important that the management functions governing that data do the opposite and converge, in order to manage this ever-broadening array of data assets.
