The Data Challenge of Systemic Risk
Getting the Roles of Government and Industry Right
Following its late-September workshop in Basel, Switzerland, to discuss a global identification system being considered to address systemic risk, the G-20 Financial Stability Board on October 15 called for "a global legal entity identifier system which uniquely identifies parties to financial transactions with an appropriate governance structure representing public interest." The US Treasury's Office of Financial Research (OFR), chartered under the Dodd-Frank Act, has already been active in this arena, issuing plans for legal entity identifier (LEI) standards.
US agencies, the International Organization of Securities Commissions (Iosco) and the Bank for International Settlements (BIS) see a global LEI system as a prerequisite to analyzing systemic risk, one that will allow regulators to better see what they need in order to judge or prevent events that could bring about another financial crisis. To do this, it is understood, they need data transparency and the ability to aggregate data consistently.
Without unique, unambiguous and universal computer-usable identifiers, the global financial industry inevitably gets multiple versions of identical identifying information. The impact is predictable—transactions that need to match for payment and settlement, and transactions conducted by the same counterparty in the same products that need to be aggregated into positions for risk analysis, do not match, nor do they get aggregated properly. Because systemically important financial institutions are global and transcend sovereign governments' reach, local regulators' rules and even regional compacts, regulatory oversight is neither timely nor comprehensive.
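The aggregation failure described above can be pictured in a minimal sketch. All firm names, identifiers and amounts below are hypothetical, invented purely for illustration: three records for the same counterparty carry three proprietary identifiers, so a naive grouping splits one exposure into three, while a single shared identifier collapses them into one position.

```python
# Hypothetical trade records: the same counterparty ("Acme Holdings")
# identified three different ways by three different internal systems.
trades = [
    {"counterparty_id": "ACME-NY-001", "notional": 10_000_000},
    {"counterparty_id": "AcmeHoldings", "notional": 5_000_000},
    {"counterparty_id": "0231-ACME", "notional": 7_500_000},
]

def aggregate(records, key):
    """Sum notionals grouped by the given identifier field."""
    totals = {}
    for r in records:
        totals[r[key]] = totals.get(r[key], 0) + r["notional"]
    return totals

# Without a universal identifier, one exposure appears as three.
print(len(aggregate(trades, "counterparty_id")))  # 3 "counterparties"

# An invented mapping of legacy codes to one (made-up) LEI-style code
# collapses them into a single aggregated position.
lei_map = {
    "ACME-NY-001": "5493001KJTIIGC8Y1R12",
    "AcmeHoldings": "5493001KJTIIGC8Y1R12",
    "0231-ACME": "5493001KJTIIGC8Y1R12",
}
for r in trades:
    r["lei"] = lei_map[r["counterparty_id"]]

print(aggregate(trades, "lei"))  # one entity, 22_500_000 total
```

The same mismatch is what causes trades that should pair off for payment and settlement to fail to match.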
We now realize we have no way of "seeing" the same counterparty's risk exposure across the different financial firms from which it obtains loans, with which it enters into swaps contracts, or which set risk exposure limits for it. In the US, the OFR is empowered to standardize the types and formats of data to be collected from financial firms. The data being requested will find its way into a newly created data center overseen, and perhaps run, by the OFR. The data will contain an unprecedented level of granular information, including information on positions, transactions, valuation methods, cashflows and the identities of counterparties. This level of granularity is required to make the calculations necessary for analyzing systemic risk. Such data had previously been available only periodically, and only to on-site examiners of individual financial institutions.
Data this granular and comprehensive has never before been requested by, or concentrated within, a single government financial oversight agency, much less at this scale and frequency. A myriad of global economic, market and company-specific data will also have to be sourced from hundreds of data vendors and government sources. Policies for computing systemic risk exposures will need to be set; for example, tolerances will have to be defined for how much systemic risk should be allowed. Dynamic scenarios must be stress-tested against the collected data for catastrophic events of every imaginable kind, from oil spills to weather to war. Volatility, liquidity, capital and leverage gauges must be calibrated and likewise stress-tested around these scenarios. The OFR will need a variety of analytical tools, yet to be developed, to sift through unprecedented quantities of data pouring in from financial institutions.
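The kind of calculation involved can be sketched in miniature. Everything here is invented for illustration: the positions, the per-scenario shock factors and the tolerance threshold are hypothetical and bear no relation to any actual OFR methodology.

```python
# Hypothetical aggregated positions, keyed by asset class.
positions = {"equities": 40_000_000, "rates": 120_000_000, "energy": 25_000_000}

# Invented stress scenarios: fractional loss applied per asset class.
scenarios = {
    "oil_spill": {"equities": 0.05, "rates": 0.01, "energy": 0.30},
    "rate_shock": {"equities": 0.10, "rates": 0.15, "energy": 0.02},
}

TOLERANCE = 20_000_000  # invented systemic-loss threshold

def stressed_loss(positions, shocks):
    """Total loss if every asset class is shocked simultaneously."""
    return sum(positions[a] * shocks.get(a, 0.0) for a in positions)

for name, shocks in scenarios.items():
    loss = stressed_loss(positions, shocks)
    flag = "BREACH" if loss > TOLERANCE else "ok"
    print(f"{name}: loss={loss:,.0f} [{flag}]")
```

The real exercise differs in scale, not kind: millions of positions, hundreds of scenarios, and thresholds that are themselves policy decisions still to be made.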
What matters to the financial industry is that the division of labor between government and industry is set appropriately in this endeavor. Here, regulators can clearly make a significant contribution by mandating a global standard. They can also create their own tools for analyzing systemic risk. However, the government has expressed its belief that the global standard will remove much cost and risk from the infrastructure of financial institutions; if that claim is true, it is up to the industry to leverage such global regulatory compulsion for its own benefit.
No government can extract the cost savings and risk reduction in financial institutions promised by the global identification standard. Only those financial institutions affected—the largest, globally active, "too big to fail" ones, the systemically important financial institutions (Sifis), as they are called in the Dodd-Frank Act—can do this. So where are these firms in stepping up to claim the benefit the government insists is there? And is it there? My colleagues and I have estimated that each financial institution stands to save $1 billion annually. How? By sharing the costs and mutualizing the risk in a common reference data utility, a central counterparty for data management.
The idea, expressed by some, that a common standard will eliminate silos of duplicated reference data and thereby save costs is valid up to a point, but it does not hold up against the reality of enterprise solutions being foisted on silo governance structures. It sounds like it should work, but it doesn't.
Without a common utility shared by all the largest financial institutions, those institutions will be spending more, not less. They will be paying a government assessment to operate the US government's data center, and perhaps multiple governments' data centers. They will still pay to source data from multiple vendors, who will continue to extract information from paper documents (with the obvious inherent risk of human error). Financial institutions will still arrive at different end-of-day valuations for the same product, and they will only learn of errors in reference data when they try to pay for what they bought and it fails to settle properly.
They will be better able to aggregate data across multiple business silos, but each silo will adopt the standard at its own pace. Taken together with the cost of reconciling legacy numbering conventions against the global standard, costs will increase for the foreseeable future.
The alternative, a common industry platform for reference data distributed over an internet-like infrastructure, can achieve significant cost savings over the short term for each firm and a dramatic lowering of costs for the entire industry. After all, the reference data platforms in each firm represent duplicated costs; install a common utility platform and the savings are obvious. Remember when each firm had its own vault, before central securities depositories were created? Only the biggest vaults survived, and they're now used as restaurants.
Allan Grody is president of Financial InterGroup Advisors, which has submitted proposals for implementing a global identification system as outlined in this analysis to the OFR, Securities and Exchange Commission and the Commodity Futures Trading Commission.
Copyright Infopro Digital Limited. All rights reserved.