Aggregate to Accumulate
Risk data aggregation raises hard questions about how to divide up data and who should be assigned which pieces of it.
Kate Toumazi, Global Head of Risk Data Services, Thomson Reuters, explains how firms should deal with enterprise-wide data, which creates complications both for dividing it up and for aggregating it.
1. How should data dictionaries or definitions be established as a foundation for data aggregation efforts?
In accordance with Basel, financial institutions must take an enterprise approach to how they manage their risk and have a robust system that utilizes consistent data across the entity. A strong data architecture is without question critical for risk data aggregation, and a key facet of any firm-wide data architecture is a consistent set of data dictionaries. The reality, however, is that for most firms the technical challenges are compounded when differing data dictionaries are used across the organization. In an ideal scenario, a firm would pick a best-in-breed dictionary and look to roll it out across the entire enterprise. This may mean tweaking existing capabilities so that a broader array of data can be harmonized into a single, more scalable model.

A recent survey of globally systemically important banks (G-SIBs) further highlighted the challenge firms are facing: it showed an increase in the number of banks unlikely to be compliant with BCBS 239 by the 2016 implementation deadline. In fact, more than half of those surveyed said they are not going to be ready. This truly underscores the complexity of the challenge, which is growing rather than shrinking, and the need for a solution remains critical. We can all hear the regulatory clock ticking, and firms need to work towards the best viable solution for their business given their existing infrastructures.
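The harmonization step described above can be sketched in miniature. The snippet below is a minimal illustration, not anything from the interview: the desk names, field names and mapping tables are all hypothetical, standing in for the per-department data dictionaries a firm would map onto one canonical, firm-wide model.

```python
# Hypothetical sketch: harmonizing two departmental data dictionaries into
# a single firm-wide model by renaming each desk's local field names to
# agreed canonical terms.

# Each desk's dictionary maps its local field names to the canonical ones.
CREDIT_DESK_MAP = {"cpty_id": "counterparty_id", "ntl": "notional", "ccy": "currency"}
MARKET_DESK_MAP = {"counterparty": "counterparty_id", "notional_amt": "notional",
                   "currency_code": "currency"}

def harmonize(record: dict, field_map: dict) -> dict:
    """Rename a record's local fields to the canonical dictionary terms;
    fields with no mapping pass through unchanged."""
    return {field_map.get(k, k): v for k, v in record.items()}

credit_trade = {"cpty_id": "C123", "ntl": 1_000_000, "ccy": "USD"}
market_trade = {"counterparty": "C123", "notional_amt": 500_000, "currency_code": "EUR"}

harmonized = [harmonize(credit_trade, CREDIT_DESK_MAP),
              harmonize(market_trade, MARKET_DESK_MAP)]
# Both records now share one schema: counterparty_id, notional, currency.
```

In practice the mapping tables themselves are the hard part: agreeing the canonical terms across desks is the governance work the dictionary rollout entails, and the code is the easy remainder.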
2. Who should the stakeholders be and what should their roles be, when assigning responsibilities for data domains?
To comply with Basel, firms must be proactive in how they provide governance and oversight of their risk systems, policies and procedures. They must truly own how they measure and mitigate risk. In light of this, one of the biggest organizational changes we have seen across numerous firms is the appointment of a chief data officer who reports to, or operates on behalf of, the board. We believe this trend will continue for two reasons: first, by elevating the importance of the data function within the organization, firms are highlighting the strategic importance of getting it right; second, and perhaps more importantly, it specifically assigns accountability to a senior individual.

The stakeholders for risk data aggregation clearly sit across numerous parts of the organization, including risk, finance, IT and data operations, and these functions must all work together to create the overall structure and composition of the governance and delivery organization. Front office and back office are often not joined up, and the front office in particular is often not incentivized to input accurate data, which results in manual interventions later to correct it. By making a single senior figure responsible for data across the organization, many firms are looking to address these problems, and they are far more likely to succeed in spite of the fragmentation.
3. Can enterprise-wide data be broken down, scrutinized and reorganized to address risk management? How should that process work?
A bank should be able to generate accurate and reliable risk data to meet the necessary reporting requirements. To accomplish this, data should be aggregated on a largely automated basis to minimize errors.
Only with an enterprise-wide view can the data be aggregated to truly address risk management. Fragmentation is public enemy number one when it comes to aggregated risk management. That does not necessarily mean the only way to scrutinize the data is a single granular data repository across the entire firm, with a single risk management system feeding off it. We see many banks, for example, looking to technology solutions that create a federated model for a single data repository: a layer over and above their existing databases that tries to create a single data model midway through the data lifecycle.
How far banks need to go towards this depends on how consistent their data models are and at what level they are looking to aggregate their risk: by country, by region, by group, or some other level. Even if silos of data are not being physically broken down, one thing is certain: the collection, storage and maintenance of the data can no longer be managed in a silo if firms are to fully address their risk management challenges.
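The federated approach described above can be illustrated with a small sketch. Everything here is hypothetical (the silo contents, field names and exposure figures are invented for illustration): it shows a thin aggregation layer that reads records from separate silo stores, assumed to already share a common data model, and rolls exposure up at a chosen level without physically merging the silos.

```python
# Hypothetical sketch: a federated aggregation layer over existing silos.
# The silos stay separate; only the aggregation query spans them.
from collections import defaultdict

# Two silo "databases", already expressed in a common data model.
SILO_A = [{"country": "UK", "region": "EMEA", "exposure": 100.0},
          {"country": "DE", "region": "EMEA", "exposure": 50.0}]
SILO_B = [{"country": "US", "region": "AMER", "exposure": 200.0},
          {"country": "UK", "region": "EMEA", "exposure": 25.0}]

def aggregate(silos, level):
    """Sum exposure across all silos, grouped at the requested level.
    level may be "country", "region", or "group" (one firm-wide figure)."""
    totals = defaultdict(float)
    for silo in silos:
        for record in silo:
            key = "GROUP" if level == "group" else record[level]
            totals[key] += record["exposure"]
    return dict(totals)

by_country = aggregate([SILO_A, SILO_B], "country")
by_group = aggregate([SILO_A, SILO_B], "group")
```

The design choice mirrors the trade-off in the answer above: the silos need not be dismantled, but they must share a consistent enough data model that the overlay can group and sum their records meaningfully.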
4. What impact is the stress testing regimen of CCAR and BCBS 239 having on risk data aggregation efforts?
The Basel Committee on Banking Supervision defines "risk data aggregation" as "defining, gathering and processing risk data according to the bank's reporting requirements to enable the bank to measure its performance against its risk tolerances/appetite". BCBS 239 is core to this statement, and its specific data standards highlight the vital role data plays in implementing true risk data aggregation.
The biggest impact we are seeing is increased investment in data aggregation. It goes without saying that post 2008 most major institutions were looking to see how they could improve their aggregation to avoid the same lack of transparency and inability to respond on a timely basis to market and credit risks, but today's regulations are adding the extra pressure. The fact that BCBS 239 has milestones requiring firms to report on their progress also means this has been top of the agenda.
The other facet of the regulations that differs from what may have been in place before is the explicit requirement to provide a forward-looking assessment of risk to senior management. This includes forecasts or scenarios for key market variables and their effects on the bank, giving senior management a much-needed view of the likely trajectory of the firm's capital and risk profile. This change adds yet another layer of complexity to what is already a substantial undertaking. It also drives further investment into data aggregation efforts, to ensure that not only are historic and current risk calculations and measures consistent, but any forward-looking views are also modeled in a consistent way.