Open Platform: To Tame the ‘Zoo’ of Market Data, First Tame the Metadata Monster

Over recent years, the quantity and complexity of the data required for a financial institution to operate effectively have increased dramatically, and it is reasonable to assume that demand for data will continue to grow at an accelerating rate.
Some of the factors driving this growth are already clear: the tightened regulatory regimes designed after the credit crunch (Dodd-Frank, EMIR); the possible (if not inevitable) structural separation of retail and investment banking; the continuing globalization of the financial markets; and the increasingly pervasive nature of social media as an information source. All of these influences will result in more data and increasingly complex data requirements.
This is a multi-faceted problem for financial institutions: how to conform to a new regulatory data regime, take advantage of more sources of data, and turn potentially vast heaps of unstructured and unrelated data into useful information. In addition, data must flow between all parties involved in the financial process—from a client to an outsourced administrator, through to existing and new reporting bodies. More interfaces will be required, resulting in more data flows.
For players in the financial markets, all these factors point to escalating costs as the challenge of managing data grows. Let’s examine just one small part of this problem: the onboarding of clients by a third-party administrator, which is already a lengthy and costly exercise and is unlikely to get easier or cheaper unless firms actively take steps to tame the problem. But how do you tame something like data, which is rapidly moving from being a semi-domesticated animal to a feral, free-roaming beast?
In particle physics, the term “particle zoo” is used colloquially to describe the extensive list of known elementary particles, which can seem as numerous and varied as the species in a zoo. The underlying theory shows, however, that all the particles in the “zoo” share a common ancestry. In a similar fashion, the ever-increasing number of data sources and the growing complexity of data can present a confusing landscape from both a business and a technology perspective. To control this “data zoo,” one must first fully “comprehend” it, both in the sense of understanding how it is organized at a basic level and in the sense of understanding what it encompasses. That order and control is achieved by identifying, analyzing and recording the data’s meaning and behavior in a store: in other words, a metadata repository (or perhaps a “cage” in the “data zoo”).
To take a specific example, from both a client’s and an outsourced service provider’s perspective, there are significant initial and ongoing costs in onboarding and then maintaining a client’s book of business. The initial costs typically arise in understanding a client’s data thoroughly, both semantically and in terms of its end-to-end flow. Given the disparate and heterogeneous nature of client front-end systems, data formats and the services being provided, this can be a significant undertaking. The services provided can include trade validation, trade enrichment, matching, settlement, custody, accounting (on occasion, across multiple accounting platforms) and client reporting. The systems supporting a particular business function will invariably have their own particular understanding of the data and their own specific data formats. Further, any mapping and transformation documentation (if it exists) will typically consist of manually produced and maintained spreadsheets, which more often than not contain errors and are out of date. In essence, a large element of the onboarding cost arises from poor documentation and the lack of accessible descriptions of client data, internal data, and data flows.
A vital part of addressing the problems outlined above is to implement a metadata repository. The primary function of this repository should be to consume data models such as messaging formats, data structures, report contents and the associated mappings, transformations and any business rules applied as data moves between these structures. A second function is to hold business descriptions and knowledge about the data: Recording information about the data itself—what it means, how it flows from place to place, and how it is transformed when it is moved—enables users to make use of the data more easily and consistently without being distracted by technical details of how or where data is stored or represented.
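To make this concrete, here is a minimal sketch of how such a repository might be structured. It is illustrative only, and the names used (DataElement, FieldMapping, MetadataRepository and so on) are invented for this example rather than taken from any particular product or standard: the repository records what each field means, how fields map to one another, and can answer simple lineage questions such as “where does this value come from?”

    # Illustrative sketch of a metadata repository (all names and data hypothetical).
    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    @dataclass(frozen=True)
    class DataElement:
        """A named field within a system or message format, with its business meaning."""
        system: str        # e.g. "ClientOMS", "SettlementEngine"
        name: str          # e.g. "trade_qty"
        description: str   # plain-language business description

    @dataclass
    class FieldMapping:
        """How one element is derived from another as data moves between systems."""
        source: DataElement
        target: DataElement
        transformation: str   # e.g. "cast to decimal", "ISO date -> yyyymmdd"
        business_rule: str = ""

    class MetadataRepository:
        """Stores data elements and mappings, and answers simple lineage queries."""

        def __init__(self):
            self.elements: Dict[Tuple[str, str], DataElement] = {}
            self.mappings: List[FieldMapping] = []

        def register_element(self, element: DataElement) -> None:
            self.elements[(element.system, element.name)] = element

        def register_mapping(self, mapping: FieldMapping) -> None:
            self.mappings.append(mapping)

        def lineage(self, system: str, name: str) -> List[FieldMapping]:
            """Return every mapping that feeds the given element, directly or indirectly."""
            result: List[FieldMapping] = []
            frontier = [(system, name)]
            seen = set()
            while frontier:
                current = frontier.pop()
                if current in seen:
                    continue
                seen.add(current)
                for m in self.mappings:
                    if (m.target.system, m.target.name) == current:
                        result.append(m)
                        frontier.append((m.source.system, m.source.name))
            return result

    repo = MetadataRepository()
    client_qty = DataElement("ClientOMS", "Qty", "Traded quantity as sent by the client")
    internal_qty = DataElement("TradeStore", "quantity", "Normalised trade quantity")
    repo.register_element(client_qty)
    repo.register_element(internal_qty)
    repo.register_mapping(FieldMapping(client_qty, internal_qty, "cast to decimal, no scaling"))

    for m in repo.lineage("TradeStore", "quantity"):
        print(f"{m.source.system}.{m.source.name} -> {m.target.system}.{m.target.name}: {m.transformation}")

In a real repository the same structure would also capture message formats, report contents and the business rules applied along the way; the point of the sketch is simply that meaning, mappings and transformations live in one queryable place rather than in scattered spreadsheets.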
Halving Onboarding Costs
Taking this metadata-centric approach, firms can substantially reduce the time, and therefore the cost, of initially onboarding clients from a transaction-processing perspective. We estimate this approach can reduce onboarding time and cost by between 50 and 70 percent. The benefit arises primarily from having a clear and accurate understanding of internal data and data flows, and of how these interact with a client’s data.
To achieve this, firms would incur an initial overhead of capturing the internal data flows within the metadata repository, but this is a one-off exercise. Once completed, the results are re-usable for many clients, and the repository starts to yield a host of additional analysis and reporting benefits.
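As a schematic illustration of why the capture exercise is one-off (again using invented names and data): once the internal flows are recorded, onboarding a further client means documenting only that client’s own mappings into the internal model, while everything downstream is reused.

    # Internal flows are captured once; each new client adds only its own mappings.
    INTERNAL_FLOWS = {
        ("TradeStore", "quantity"): ("SettlementEngine", "settle_qty"),
        ("TradeStore", "price"): ("SettlementEngine", "settle_price"),
    }

    def onboard(client_name, client_to_internal):
        """Combine a client's own mappings with the shared internal flow metadata."""
        combined = dict(client_to_internal)
        combined.update(INTERNAL_FLOWS)
        print(f"{client_name}: {len(client_to_internal)} client-specific mappings, "
              f"{len(INTERNAL_FLOWS)} internal flows reused")
        return combined

    onboard("Client A", {("ClientA_OMS", "Qty"): ("TradeStore", "quantity")})
    onboard("Client B", {("ClientB_FIX", "Tag38"): ("TradeStore", "quantity")})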