CDOs Must Build Bridges, Not Silos
"Market data and reference data functions are, if not unifying into a single unit, converging under a common structure."
The separation of market and reference data functions that took place over a decade ago was not without good cause: until then, reference data was largely managed within a firm's market data function, and didn't get the recognition and support from senior management that it deserved. If memory serves, poor reference data was the number one cause of failed trades. That cost has since been eliminated to a large degree, thanks to vendors such as Markit creating products like the Reference Entity Database, and to firms getting their houses in order: creating internal "golden copies" of securities master data and, critically, translating those failed trades into a profit-and-loss (P&L) argument that alarmed senior managers enough to set aside separate budgets for reference data projects, processes and staff.
However, the sheer volume and complexity of the data now being captured, processed and monitored by financial firms mean that data poses a far greater challenge than in previous years, accounts for a larger share of budgets, and carries higher levels of risk. To address the reality that trading firms increasingly have more in common with data processing firms, banks and asset managers alike are appointing chief data officers to oversee all aspects of a firm's data management, from market data to reference data, and from data held in internal documents to confidential client data. To perform these roles successfully, CDOs must work hand-in-hand with other departments, from operations to trading functions. And, of course, they have direct oversight of the most data-intensive areas of all: their market data and reference data departments.
As a result, market data and reference data functions are, if not unifying into a single unit, converging under a common structure. This became more evident than ever at the recent European Financial Information Summit in London, where market data professionals were as concerned about provenance as about prices, and about legal entity identifiers (LEIs) as much as latency, and where reference data experts were as concerned about real-time changes to information as they were about traditional static data.
This convergence also exists beyond the world of end-user firms. For example, enterprise data management (EDM) software vendor GoldenSource has fully integrated its market data management module with its core suite of EDM capabilities. According to the vendor, this will not only help centralize overall data management, but will also make it easier to add coverage of new datasets and to manage, centrally, a firm's response to regulatory requirements that demand access to both market data and reference data. One such requirement is the Fundamental Review of the Trading Book (FRTB), proposed by the Bank for International Settlements' Basel Committee on Banking Supervision and due to take effect in 2019, which GoldenSource managing director of sales and client operations Neill Vanlint says "will change forever the way that risk and finance manage data."
To be sure, another driver of this change is the increasingly strict and burdensome regulatory environment. This is not to say that firms believe bringing these groups closer will directly save money. Rather, by creating closer ties between all data assets, and between the people, systems and groups that govern them, firms will be better placed to obtain a single, more accurate view of their data, and therefore to respond quickly and accurately to regulators' reporting demands, minimizing both fines and the cost of providing this function.
It seems that as datasets themselves diverge, with new types of data evolving and others being separated from one another for practical and budgetary purposes, it will become even more important that the management functions governing that data do the opposite and converge, in order to manage this ever-broadening array of data assets.
Copyright Infopro Digital Limited. All rights reserved.