Quest For Data Quality
Stories in this issue of Inside Reference Data cover the different means used to pursue higher data quality, including partnerships between service providers and clients, enterprise-strength tools that can bridge gaps in data processes and uses, and improved cooperation between IT and business professionals.
Interactive Data CEO Stephen Daffron tells us that being transparent with clients about the quality controls it uses and the sourcing of its data promotes better understanding of the issues when pricing or other data is incorrect.
Daffron also identifies big data, with its increasing size, granularity and lack of structure, as an opposing force to "cost direction," which is driving costs down by moving reference data sourcing, cleansing and delivery from central systems to cloud computing resources.
Big data, of course, is a broad term, as noted recently in the online-only column "Big Data Terminology." It can refer to issues such as integrating or centralizing data, or to technology resources and the scalability of data systems. These facets of big data are really about the pursuit of higher-quality data; they are the means used to improve quality, consistency and value.
Enterprise-strength data management tools are available to handle "big data" and connect data governance with other data functions, including quality measurement, access privileging, control and usage, notes CIBC's Peter McGuinness in "Choosing Tools and Setting Models." Just as Daffron points to cost concerns, so do McGuinness and RBC's Patricia Huff in this story. Firms may choose to go in-house if they either cannot make a business case to buy outside providers' tools or if they can more readily build appropriate systems on their own. Anything firms consider buying has to be justified in terms that business executives can understand, McGuinness says.
A question remains: if neither data managers nor business operations managers are likely to walk away satisfied from a negotiation on how to proceed with a data quality effort, as Huff suggests, is meaningful improvement in data quality really possible?
Correcting discrepancies, which Interactive Data approaches with transparency, is also a challenge in the corporate actions space, as Nicholas Hamilton reports. SmartStream's Adam Cottingham says discrepancies can arise when values vary between sources or when additional values turn up in custodians' files. The correct data must be retrieved from the issuer itself, which can be hard to coordinate given the many parties involved in a corporate action and the data they generate.
And getting the right data together plays a part in complying with regulation, as RBS's David Sharratt reminds us in "Facing Up to the New Regulatory World." The variety of products, asset classes and systems being used in multiple markets by a global firm such as RBS means internal, external and outsourced systems and processes all must be managed and marshalled in service to data, Sharratt says. With multiple players all having a stake, even when a choice is made to stick with internal systems, as RBC's Huff related, it becomes evident that pursuing higher-quality data can be a tall order, one that requires widespread support and participation, both among functional units within a firm and in partnership with service providers.
A closing note: With this issue, deputy editor Nicholas Hamilton has completed his work with Inside Reference Data. We will miss him and we wish him well on his next endeavor.
Copyright Infopro Digital Limited. All rights reserved.