CDOs Must Build Bridges, Not Silos
"Market data and reference data functions are, if not unifying into a single unit, converging under a common structure."
The separation of market and reference data functions that took place over a decade ago was not without good cause: until then, reference data was largely managed within a firm's market data function, and did not get the recognition and support from senior management that it deserved. If memory serves me correctly, poor reference data was the number one cause of failed trades. That cost has since been eliminated to a large degree, the result of vendors like Markit creating products such as the Reference Entity Database, and of firms getting their houses in order: creating internal "golden copies" of securities master data and, critically, translating those failed trades into a profit-and-loss (P&L) argument that alarmed senior managers enough to set aside separate budgets for reference data projects, processes and staff.
However, the sheer volumes and complexity of data now being captured, processed and monitored by financial firms mean that data is a much bigger challenge than in previous years, accounts for a larger share of budget, and carries higher levels of risk. To address the reality that trading firms increasingly have more in common with data processing firms, banks and asset managers alike are appointing chief data officers (CDOs) to oversee all aspects of a firm's data management, from market data to reference data, and from data held in internal documents to confidential client data. To perform these roles successfully, CDOs must work hand-in-hand with various other departments, from operations to trading functions. And, of course, they have direct oversight of the most data-intensive areas of all: their market data and reference data departments.
As a result, market data and reference data functions are, if not unifying into a single unit, converging under a common structure. This became more evident than ever at the recent European Financial Information Summit in London, where market data professionals were as concerned about provenance as about prices, and as much about legal entity identifiers (LEIs) as about latency, and where reference data experts were as concerned about real-time changes to information as they were about traditional static data.
This convergence also exists beyond the world of end-user firms. For example, enterprise data management (EDM) software vendor GoldenSource has fully integrated its market data management module with its core suite of EDM capabilities. According to the vendor, this will not only help centralize overall data management, but will also make it easier to add coverage of new datasets, and to manage a firm's response centrally to regulatory requirements that demand access to both market data and reference data. One such requirement is the Fundamental Review of the Trading Book (FRTB) proposals from the Bank for International Settlements' Basel Committee on Banking Supervision, which take effect in 2019 and which GoldenSource managing director of sales and client operations Neill Vanlint says "will change forever the way that risk and finance manage data."
To be sure, another driver of this change is the increasingly strict and burdensome regulatory environment. This is not to say that firms believe bringing these groups closer will directly save money, but rather that by creating closer ties between all data assets, and between the people, systems and groups that govern them, they will be better placed to obtain a single, more accurate view of their data, and therefore to respond quickly and accurately to reporting demands from regulators, minimizing both fines and the cost of providing this function.
It seems that as the datasets themselves diverge, with new types of data evolving and others being separated from one another for practical and budgetary purposes, it will become even more important for the management functions that govern that data to do the opposite, and converge in order to manage this ever-broadening array of data assets.