Quest For Data Quality

Michael Shashoua

Stories in this issue of Inside Reference Data cover the different means used to pursue higher data quality, including partnerships between service providers and clients, enterprise-strength tools that can bridge gaps in data processes and uses, and improved cooperation between IT and business professionals.

Interactive Data CEO Stephen Daffron tells us that being transparent with clients about the quality controls it uses and the sourcing of its data promotes better understanding of the issues when pricing or other data is incorrect.

Daffron also identifies big data, with its increasing size, granularity and lack of structure, as an opposing force to the "cost direction" that is driving costs down by moving reference data sourcing, cleansing and delivery from central systems to cloud computing resources.

Big data, of course, is a broad term, as noted recently in the online-only column "Big Data Terminology." It can refer to issues such as data integration and centralization, or to technology resources and the scalability of data systems. These facets of big data are really about the pursuit of higher-quality data. They are the means used to improve quality, consistency and value.

Enterprise-strength data management tools are available to handle "big data" and connect data governance with other data functions, including quality measurement, access privileging, control and usage, notes CIBC's Peter McGuinness in "Choosing Tools and Setting Models." Just as Daffron points to cost concerns, so do McGuinness and RBC's Patricia Huff in this story. Firms may choose to go in-house if they either cannot make a business case to buy outside providers' tools or if they can more readily build appropriate systems on their own. Anything firms consider buying has to be justified in terms that business executives can understand, McGuinness says.

A question remains: if neither data managers nor business operations managers are likely to walk away satisfied from a negotiation over how to proceed with a data quality effort, as Huff suggests, is meaningful improvement in data quality really possible?

Correcting discrepancies, which Interactive Data approaches with transparency, is also a challenge in the corporate actions space, as Nicholas Hamilton reports. SmartStream's Adam Cottingham says discrepancies can arise when values vary between sources or when additional values turn up in custodians' files. The correct data must be retrieved from the issuer itself, which can be hard to coordinate given the many parties involved in a corporate action and the data it generates.

And getting the right data together plays a part in complying with regulation, as RBS's David Sharratt reminds us in "Facing Up to the New Regulatory World." The variety of products, asset classes and systems being used in multiple markets by a global firm such as RBS means internal, external and outsourced systems and processes all must be managed and marshalled in service to data, Sharratt says. With multiple players all having a stake, even when a choice is made to stick with internal systems, as RBC's Huff related, it becomes evident that pursuing higher-quality data can be a tall order, one that requires widespread support and participation, both among functional units within a firm and in partnership with service providers.

A closing note: With this issue, deputy editor Nicholas Hamilton has completed his work with Inside Reference Data. We will miss him and we wish him well on his next endeavor.


