CDOs Must Build Bridges, Not Silos
"Market data and reference data functions are, if not unifying into a single unit, converging under a common structure."

The separation of market and reference data functions that took place over a decade ago was not without good cause: until then, reference data was largely managed within a firm’s market data function and did not get the recognition and support from senior management that it deserved. If memory serves me correctly, poor reference data was the number one cause of failed trades. That cost has since been eliminated to a large degree, thanks to vendors like Markit creating products such as the Reference Entity Database, to firms getting their houses in order and building internal “golden copies” of securities master data, and, critically, to firms translating those failed trades into a profit-and-loss (P&L) argument that alarmed senior managers enough to set aside separate budgets for reference data projects, processes and staff.
However, the sheer volume and complexity of data now being captured, processed and monitored by financial firms make data a much bigger challenge than in previous years: it accounts for a larger share of budgets and carries higher levels of risk. To address the reality that trading firms increasingly have more in common with data processing firms, banks and asset managers alike are appointing chief data officers to oversee all aspects of a firm’s data management, from market data to reference data, and from data held in internal documents to confidential client data. To perform these roles successfully, CDOs must work hand-in-hand with various other departments, from operations to trading functions. And, of course, they have direct oversight of the most data-intensive areas of all: the market data and reference data departments.
As a result, market data and reference data functions are, if not unifying into a single unit, converging under a common structure. This became more evident than ever at the recent European Financial Information Summit in London, where market data professionals were as concerned with provenance as with prices, and with legal entity identifiers (LEIs) as much as with latency, and where reference data experts were as concerned with real-time changes to information as with traditional static data.
This convergence also exists beyond the world of end-user firms. For example, enterprise data management (EDM) software vendor GoldenSource has fully integrated its market data management module with its core suite of EDM capabilities. According to the vendor, this will not only help centralize overall data management, but will also make it easier to add coverage of new datasets and to manage, centrally, a firm’s response to regulatory requirements that demand access to both market data and reference data. One such requirement is the Fundamental Review of the Trading Book (FRTB) proposed by the Bank for International Settlements’ Basel Committee on Banking Supervision, which takes effect in 2019 and which, says GoldenSource managing director of sales and client operations Neill Vanlint, “will change forever the way that risk and finance manage data.”
To be sure, another reason for this change is the increasingly strict and burdensome regulatory environment. This is not to say that firms believe bringing these groups closer together will directly save money. Rather, by creating closer ties between all data assets, and between the people, systems and groups that govern them, firms will be better placed to obtain a single, more accurate view of their data, and therefore to respond quickly and accurately to regulators’ reporting demands, minimizing both fines and the cost of providing this function.
It seems that as the datasets themselves diverge, with new types of data evolving and others being separated from one another for practical and budgetary purposes, it will become even more important for the management functions that govern that data to do the opposite and converge, in order to manage this ever-broadening array of data assets.