Michael Shashoua: What Last Year and This Year Can Teach Us for 2017
The lesson to be learned from 2015 is that improvements in data governance planning need to continue.

Inside Reference Data closed out 2015 with a roundtable of data management experts, seeking to identify trends and challenges likely to dominate the industry in 2016. We heard that data centralization is now the main focus of these experts and their colleagues, with debate about how to achieve that centralization still continuing.
To understand how the industry reached this point, it’s worth looking back at what experts were saying in similar interviews conducted the year before, and at whether their predictions were borne out by the end of 2015.
At the end of 2014, reference data management advances seemed likely to be incremental, if they happened at all. The development deemed most likely was that data management technology would mature, with its focus centering on the integration of data sources and on getting firms to establish data strategies and governance plans.
At that time, we found evidence that many firms were taking on data governance challenges. TIAA-CREF had deployed an “acquisition and attrition” model. Data governance development was helping to support analytics, said Julia Bardmesser of Citi, who emphasized the importance of data standardization. Canadian firms, including TD Bank and Canadian Western Bank, had found benefits from making data governance plans cross-functional.
As 2015 progressed, the industry started to tie data governance work to addressing risk data management, and regarded data governance as a way to better handle data relevant to risk—especially to comply with risk data aggregation guidelines, such as BCBS 239. Last year started out with BCBS 239 driving changes in data infrastructure, but continued with overall readiness to comply still lagging, leaving BCBS 239 as unfinished business in 2016.
Management Methods
Looking forward, as sources from Acadian Asset Management, Chartis Research, Dun & Bradstreet, HSBC and others did at the start of this year, enterprise data management (EDM), master data management (MDM), and the influence of chief data officers are likely to figure in data centralization efforts. Chris Johnson of HSBC cautioned that using an EDM system to centralize data can reduce flexibility, while Robert Iati of Dun & Bradstreet sees consortia such as SmartStream’s SPReD service as influential in breaking down proprietary data silos, thereby facilitating centralization.
With MDM also being used to federate financial industry records, EDM and MDM will have to be harmonized, said consultant Steve Lachaga. The industry cannot be content with good data still sitting in those silos, he added. The capability to integrate and manage multiple databases is certainly available, said Hugh Stewart of Chartis Research, so the industry should capitalize on that by building an expanded data model that makes it possible to reuse the same data wherever it is relevant for compliance reporting and risk management.
All of these ideas and potential advances are promising, but they may still require leadership support to become a reality. The growing influence of CDOs in firms’ leadership improves the chances of such projects being supported and implemented. However, according to Brian Buzzelli of Acadian Asset Management, immediate operational and business demands force most CDOs to focus on data quality and adapting data operations rather than on improvements that would represent genuine progress, which could weigh down their ability to lead on such visions. HSBC’s Johnson holds out hope, though, that the CDO role will remain fluid enough to hold sway over deciding the best ways to manage data and to organize its collection and analysis.
Whether CDOs drive the development of data centralization or not, the lesson to be learned from 2015 is that improvements in data governance planning, which still are not complete, need to continue. Otherwise, at the start of 2017, the industry may be looking back and trying to figure out how and why data centralization efforts are stalled or have failed.