Quest For Data Quality
Stories in this issue of Inside Reference Data cover the different means used to pursue higher data quality, including partnerships between service providers and clients, enterprise-strength tools that can bridge gaps in data processes and uses, and improved cooperation between IT and business professionals.
Interactive Data CEO Stephen Daffron tells us that being transparent with clients about the quality controls it uses and the sourcing of its data promotes better understanding of the issues when pricing or other data is incorrect.
Daffron also identifies big data, with its increasing size, granularity and lack of structure, as a force opposing the "cost direction" trend, which is driving costs down by moving reference data sourcing, cleansing and delivery from central systems to cloud computing resources.
Big data, of course, is a broad term, as noted recently in the online-only column "Big Data Terminology". It can refer to issues such as integrating or centralizing data, or to technology resources and the scalability of data systems. These aspects of big data are really about the pursuit of higher-quality data. They are the means used to improve quality, consistency and value.
Enterprise-strength data management tools are available to handle "big data" and connect data governance with other data functions, including quality measurement, access privileging, control and usage, notes CIBC's Peter McGuinness in "Choosing Tools and Setting Models." Just as Daffron points to cost concerns, so do McGuinness and RBC's Patricia Huff in this story. Firms may choose to go in-house either because they cannot make a business case for buying outside providers' tools or because they can more readily build appropriate systems on their own. Anything firms consider buying has to be justified in terms that business executives can understand, McGuinness says.
A question remains: if neither data managers nor business operations managers are likely to walk away satisfied from a negotiation on how to proceed with a data quality effort, as Huff suggests, is it really possible to achieve meaningful improvement in data quality?
Correcting discrepancies, which Interactive Data approaches with transparency, is also a challenge in the corporate actions space, as Nicholas Hamilton reports. SmartStream's Adam Cottingham says discrepancies can arise when values vary between sources or when additional values turn up in custodians' files. The correct data has to be retrieved from the issuer itself, which can be hard to coordinate given the many parties involved in a corporate action and the data they generate.
And getting the right data together plays a part in complying with regulation, as RBS's David Sharratt reminds us in "Facing Up to the New Regulatory World." The variety of products, asset classes and systems being used in multiple markets by a global firm such as RBS means internal, external and outsourced systems and processes all must be managed and marshalled in service to data, Sharratt says. With multiple players all having a stake, even when a choice is made to stick with internal systems, as RBC's Huff related, it becomes evident that pursuing higher-quality data can be a tall order that requires widespread support and participation, both among functional units within a firm and in partnership with service providers.
A closing note: With this issue, deputy editor Nicholas Hamilton has completed his work with Inside Reference Data. We will miss him and we wish him well on his next endeavor.