Panel: Firms Face Increased Complexity When Measuring Data Quality

Measuring data quality has become more complex since the financial crisis, with firms implementing more advanced metrics to avoid a dramatic rise in exceptions, according to a panel of speakers at the Paris Financial Information Summit in June.

Panelists said regulators remain focused on the quality and completeness of data, but ensuring data is of good quality has become more challenging amid high levels of volatility and differing definitions of data quality. Paris-based Philippe Rozental, head of asset servicing, Société Générale Securities Services, said market volatility has resulted in a high number of exceptions from data validation rules.

This has forced firms to change their approach to measuring data quality. London-based Jean Williams, vice-president, software solutions, Asset Control, said the rule requirements are a lot more complex now. "It does illustrate we're getting more and more complex and more specific business-generated manipulation of the data," she said.

It is no longer enough to have rules that simply identify which source to use where. London-based Brian Sentance, CEO, Xenomorph, said the simple rule sets traditionally used in enterprise data management (EDM) projects have evolved into rule sets that may, for example, take volatility into account. "There has been an increase in complexity in rules applied, so that institutions can avoid their data management teams having people checking ever more exceptions," he said.
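Sentance's point can be made concrete with a short sketch. The Python below is a hypothetical illustration, not any panelist's actual rule set: it contrasts a fixed-percentage tolerance check with one whose tolerance scales with recent volatility, so that turbulent markets do not automatically flood the data management team with exceptions. The function names, thresholds and sample prices are all assumptions chosen for illustration.

    # Hypothetical sketch: fixed-tolerance versus volatility-aware validation rules.
    from statistics import pstdev

    def fixed_tolerance_exception(prev_price, new_price, tolerance=0.02):
        """Flag an exception if the day-over-day move exceeds a fixed percentage."""
        move = abs(new_price - prev_price) / prev_price
        return move > tolerance

    def volatility_aware_exception(price_history, new_price, k=3.0):
        """Flag an exception only if the move is large relative to recent volatility.

        The tolerance widens when recent returns have been volatile, so routine
        market turbulence does not generate a flood of exceptions to check."""
        returns = [(b - a) / a for a, b in zip(price_history, price_history[1:])]
        recent_vol = pstdev(returns) if len(returns) > 1 else 0.0
        move = abs(new_price - price_history[-1]) / price_history[-1]
        # Keep a small floor so quiet markets still allow normal price movement.
        return move > max(k * recent_vol, 0.005)

    # A 4% move arriving after a volatile stretch of prices.
    history = [100.0, 103.0, 98.0, 104.0, 99.0, 105.0]
    print(fixed_tolerance_exception(history[-1], 109.2))   # True: the fixed 2% rule fires
    print(volatility_aware_exception(history, 109.2))      # False: within 3x recent volatility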

In addition, users have to deal with the fact that data quality can be subjective. Paris-based Henri Mocka, head of fund accounting strategic operating model, BNP Paribas, said consumers view the data differently, and it is a "very complex world." The data has to be integrated for different types of internal clients, he said.

Different internal departments may use the data differently, and there may be differences in how they want to view the data. London-based Llew Nagle, head of data quality, Barclays Capital, said: "Data quality is defined by who is using it."

Meanwhile, the importance of quality and standardization is also reflected in the data management projects firms are currently prioritizing. Panelists said the projects they were working on included integrating fund accounting and pricing systems, creating a common internal identifier for funds and clients, and setting standard policies for reference data.

London-based Gert Raeves, senior vice-president, partnerships and marketing, GoldenSource, said it is a case of "nothing has changed and everything has changed." There has been a fundamental shift in urgency, he said, and the market changes have "allowed us to articulate the value [of data management projects] in a completely different way."
