Panel: Firms Face Increased Complexity When Measuring Data Quality
Measuring data quality has become more complex since the financial crisis, with firms implementing more advanced metrics to avoid a dramatic rise in exceptions, according to a panel of speakers at the Paris Financial Information Summit in June.
Panelists said regulators remain focused on data quality and completeness, but ensuring good-quality data has become more challenging amid high volatility and differing definitions of data quality. Paris-based Philippe Rozental, head of asset servicing, Société Générale Securities Services, said market volatility has resulted in a high number of exceptions from data validation rules.
As a result, firms have had to change their approach to measuring data quality. London-based Jean Williams, vice-president, software solutions, Asset Control, said the rule requirements are now far more complex. "It does illustrate we're getting more and more complex and more specific business-generated manipulation of the data," she said.
It is no longer enough to have rules that identify which source to use where. London-based Brian Sentance, CEO, Xenomorph, said the simple rule sets traditionally used in enterprise data management (EDM) projects have evolved into rule sets that might, for example, take volatility into account. "There has been an increase in complexity in rules applied, so that institutions can avoid their data management teams having people checking ever more exceptions," he said.
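To make the shift concrete, the sketch below shows one way a volatility-aware validation rule of the kind Sentance describes might look in practice: instead of a fixed tolerance, the exception threshold widens with recent volatility, so routine swings in turbulent markets are not flagged. This is purely illustrative; the function, field names, and thresholds are hypothetical and not drawn from any panelist's system.

```python
# Illustrative sketch only: a price-validation rule whose exception threshold
# scales with recent volatility, rather than using a fixed tolerance.
# All names and parameter values here are hypothetical.

import statistics

def is_exception(prices: list[float], new_price: float,
                 base_tolerance: float = 0.02,
                 vol_multiplier: float = 3.0) -> bool:
    """Flag a new price as an exception if it moves more than a
    volatility-adjusted tolerance away from the previous price."""
    if len(prices) < 2:
        return False  # not enough history to judge

    # Recent volatility: standard deviation of period-on-period returns
    returns = [(b - a) / a for a, b in zip(prices[:-1], prices[1:])]
    recent_vol = statistics.stdev(returns) if len(returns) > 1 else 0.0

    # Tolerance widens in volatile markets, so normal swings are not flagged
    tolerance = max(base_tolerance, vol_multiplier * recent_vol)

    move = abs(new_price - prices[-1]) / prices[-1]
    return move > tolerance

# A 4% move is an exception against a calm price history...
calm = [100.0, 100.1, 99.9, 100.2, 100.0]
print(is_exception(calm, 104.0))      # True

# ...but not against a volatile one, where the tolerance band has widened
volatile = [100.0, 104.0, 97.0, 103.0, 98.0]
print(is_exception(volatile, 104.0))  # False
```

In a static-threshold regime, the second case would also raise an exception, which is how volatile markets can swamp data management teams with false positives.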
In addition, users have to deal with the fact that data quality can be subjective. Paris-based Henri Mocka, head of fund accounting strategic operating model, BNP Paribas, said consumers view the data differently, and it is a "very complex world." The data has to be integrated for different types of internal clients, he said.
Different internal departments may use the data differently and may want to view it in different ways. London-based Llew Nagle, head of data quality, Barclays Capital, said: "Data quality is defined by who is using it."
Meanwhile, the importance of quality and standardization is also reflected in the data management projects currently being prioritized. Panelists said the projects they were working on included integrating fund accounting and pricing systems, creating a common internal identifier for funds and clients, and setting standard policies for reference data.
London-based Gert Raeves, senior vice-president, partnerships and marketing, GoldenSource, said it is a case of "nothing has changed and everything has changed." There has been a fundamental shift in urgency, he said, and the market changes have "allowed us to articulate the value [of data management projects] in a completely different way."