Firms Increase Focus on Measuring Data Quality in Downstream Systems


Firms are increasingly measuring data quality downstream as their data management programs mature, officials tell Inside Reference Data.

Data quality has traditionally been measured in the data repository, but not necessarily downstream. "There has been a lot of focus on building robust, quality foundation data stores, like data warehouses, but inadequate focus on making sure this ‘data store quality’ carries through to consistent ‘data usage quality’," says Dublin-based Michael McMorrow.
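The distinction between data store quality and data usage quality can be made concrete with a simple check. The sketch below is purely illustrative and is not a method described in the article: all system names, fields, and sample records are hypothetical. It measures the same completeness metric in a foundation data store and again in a downstream consumer, flagging any degradation between the two.

```python
# Illustrative sketch: comparing a completeness metric between a
# "foundation" data store and a downstream consuming system.
# All system names, fields, and sample records are hypothetical.

from dataclasses import dataclass

@dataclass
class QualityResult:
    system: str
    total_rows: int
    complete_rows: int

    @property
    def completeness(self) -> float:
        # Share of records with all required fields populated.
        return self.complete_rows / self.total_rows if self.total_rows else 0.0

def measure_completeness(records: list[dict], required: list[str], system: str) -> QualityResult:
    # Count records where every required field is present and non-empty.
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required)
    )
    return QualityResult(system=system, total_rows=len(records), complete_rows=complete)

# Hypothetical extracts from a warehouse and a downstream risk system.
warehouse_rows = [
    {"isin": "IE0000000001", "currency": "EUR"},
    {"isin": "IE0000000002", "currency": "USD"},
]
risk_rows = [
    {"isin": "IE0000000001", "currency": "EUR"},
    {"isin": "IE0000000002", "currency": None},  # degraded downstream
]

required_fields = ["isin", "currency"]
upstream = measure_completeness(warehouse_rows, required_fields, "warehouse")
downstream = measure_completeness(risk_rows, required_fields, "risk_system")

# Flag when downstream quality drops below what the data store delivers.
if downstream.completeness < upstream.completeness:
    print(f"Quality degradation: {upstream.system}={upstream.completeness:.0%}, "
          f"{downstream.system}={downstream.completeness:.0%}")
```

The point of running the identical metric in both places is that a high score in the warehouse alone says nothing about what consumers actually receive; only the downstream measurement reveals whether store quality carries through to usage quality.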
