Virtual Roundtable: Utilities: The Shape of Things to Come?
Six panelists discuss the future of utilities such as SPReD (Securities Product Reference Data) for reference data.
What business drivers are shaping reference data strategies, including choosing a utility, and how?
Amy Young, managing director, State Street: Regulatory requirements around data and documents have been accelerating since the last financial crisis, with the same data elements and supporting documentation required by multiple counterparties.
The traditional means of information transmission (e-mail and fax) are becoming difficult, if not impossible, to manage under this acceleration of requests.
As a replacement, industry participants are turning to utilities that offer the governance, controls and audit records required in today’s regulated environment. User-driven development of services, data security and pricing are key components of choosing a utility.
Virginie O’Shea, senior analyst, Aite Group: The biggest drivers for investment in reference data management are regulatory compliance, risk management, and cost control. The latter relates to working with a utility model, as there is a desire within some firms to drive down costs by pushing high-cost, non-core functions (aka non-revenue-generating functions) onto shared services.
Marc Odho, advisory board member and senior consultant, Island 20 Ventures: Senior management at financial institutions is more focused than ever on running the business and driving growth. Instrument reference data management is expensive and delivers minimal competitive advantage, providing motivation to partner with and leverage a utility and/or managed service.
Furthermore, technology and data management groups in many financial institutions do not have the capacity or resources to meet the demands of the business at an acceptable cost and time to market.
Managed services and/or utilities offer the promise of lower costs via economies of scale and improved quality of instrument reference data that will eliminate costly “breaks.” These services can address a portion of the increasing demand, thereby freeing up capacity to address other business priorities: increasing revenue, decreasing risk, reducing costs, and meeting regulatory compliance.
Mike Atkin, managing director, EDM Council: There are twin drivers propelling the industry. The first is regulatory: the need to ensure harmonization across repositories (and across firms) so that regulators can compare data, unravel aggregation and feed data into their models for financial stability analysis. The second is business value: Reference data is a core input into financial processes. Trust, confidence and alignment of data promote process automation, support model-driven analysis, provide the consistency needed for customer sentiment analysis, and enable complex financial engineering.
Adam Cottingham, vice president, data management services, SmartStream Technologies: Today, many firms are adjusting to conservative market conditions and aligning their business models accordingly. Managing reference data is no longer classified as proprietary or as providing a competitive edge. It is now commonly acknowledged that these practices are factual in nature, driven by interaction with the market, and can be delivered as shared services that provide a best-practice approach to common processes.
By taking this approach, a significant proportion of cost-savings targets can be achieved. In data management, many departments have already been outsourced, but process optimization is expected to deliver a further operational offset by diminishing the impact of poor-quality reference data throughout the transaction cycle.
Rob Ord, director and head, Global Wholesale Technology Data Management, Scotiabank: The high-level drivers shaping our reference data strategy haven’t significantly changed in the last five to 10 years: we continue to be driven by cost—how efficient are the data services offered to consumers. We are also being asked to ensure our capabilities are agile and flexible… and that’s becoming more important because of the recent focus on analytics—getting quick access to a wide variety of reference data across multiple technology platforms. And in some cases, this can be at odds with existing operational processes such as end-of-day batch processes and workflows that operate within specific time constraints and rigidly defined content.
The regulation and compliance side has always been there, but the significant change over the past few years has been the BCBS 239 risk data aggregation and risk reporting principles. Satisfying these principles has led many firms to spend an increased amount of effort focused on shoring up various capabilities, including data quality processes and data lineage documentation.
A utility… might be attractive because it could look after a number of these issues that I’ve described—cost efficiency, data quality, data lineage documentation, service-level agreements—and as a potential customer, one could leverage what the utility has already invested in. There is definitely an opportunity to do things once to service multiple customers. Many of the new BCBS 239 requirements are not really unique to any one firm, especially those relating to data quality and lineage, and a utility is positioned nicely to satisfy these.
What are SPReD’s strengths, or generally what are the strengths of any such industry-led reference data utility?
Ord: The primary strength of a utility is that one can benefit from leveraging services that it provides for multiple customers—i.e., the economies of scale argument. Another strength is that customers might be able to get some unique or non-traditional datasets from a utility that might be difficult to acquire otherwise. Data sourced directly from exchanges is an example of this.
For something like this to be attractive to a broad customer base, I feel it must have a considerable amount of out-of-the-box content. And to get considerable content on your platform, you need to deal with many data vendors… and the past has shown how challenging it can be to convince vendors to offer up their data to a shared service or utility. Some are more amenable than others, but they typically put restrictions on the utility in terms of how their content is managed. This might diminish the economies of scale, depending on the technology being leveraged by the utility. So a challenge for SPReD will be loading and serving up a broad set of content that clients want.
Odho: Currently, SPReD’s strength comes from its sponsors, JP Morgan, Morgan Stanley and Goldman Sachs. SPReD’s focus seems to be only on normalization and validation of instrument data—data that will continue to be sourced and licensed from data vendors—and not to cover issuers and counterparties. The sponsors understand the multi-asset instrument reference data management requirement well, spanning consumers across the front office, research, data operations, performance, risk and compliance, and more. Sponsors should be able to drive solutions for the various use cases, with a focus on data quality (which is key), scalability (which is vital), and data standardization, delivering cost savings based on economies of scale to the sponsors and funding continuous improvement of the service (not one golden copy, but multiple versions of the truth required by different consumer use cases).
But what are the weaknesses? SPReD is not the first reference data utility. Between 2004 and 2008, SunGard offered its Managed Data Service, and Accenture offered its Managed Reference Data Service. SunGard’s MDS was multi-tenant, offering multiple versions of the truth, while Accenture’s MRDS focused on an arbitrated golden copy. Both had big-name banks as charter customers. However, neither obtained sufficient mass adoption by financial institutions to be viable. The other potential pitfall is scope and integration costs. The scope often does not reach deep enough into financial institutions’ operational systems to address downstream quality issues and root-cause problems. High switching or conversion costs can undermine the return on investment of a utility’s business case.
Cottingham: SPReD is at the forefront of what is really an industry evolution: the standardization of processing, the reduction of operational risk and the increase in service quality within the reference data domain. Contribution and collaboration are key to its effective mutualization of data management processing, facilitated by a multi-tenanted environment.
This approach is a solution to endemic data issues that are acknowledged by many firms in the industry, but are often not affordable to resolve. By offloading the effort to a centralized, industry-led function like SPReD, the heavy lifting of source on-boarding, loading, normalizing and cross-referencing can be done once, while common data fixes are shared. These results can be blended with each firm’s own controls and extensions to deliver a data process at a fraction of the cost and at a vastly higher quality level.
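To make that “do once, share the fixes” pattern concrete, here is a minimal sketch, in Python, of how a multi-tenant service might map vendor records onto a common model, apply shared corrections, and then blend in a single firm’s own overrides. Every name in it (VendorRecord, FIELD_MAPS, SHARED_FIXES) is a hypothetical illustration, not an actual SPReD interface.

```python
# Illustrative sketch only: hypothetical names, not a real SPReD API.
from dataclasses import dataclass

@dataclass
class VendorRecord:
    vendor: str   # e.g. "VendorA"
    fields: dict  # vendor-native field names and values

# Vendor-specific field mappings, maintained once by the utility.
FIELD_MAPS = {
    "VendorA": {"Cpn": "coupon", "Mty": "maturity_date"},
    "VendorB": {"coupon_rate": "coupon", "maturity": "maturity_date"},
}

# Corrections contributed once and shared across all tenants:
# (instrument key, common field name) -> corrected value.
SHARED_FIXES = {
    ("US0378331005", "maturity_date"): "2030-06-16",  # invented fix
}

def normalize(rec: VendorRecord, isin: str, firm_overrides: dict | None = None) -> dict:
    """Map a vendor record onto the common model, apply shared fixes,
    then blend in one firm's own overrides (its version of the truth)."""
    mapping = FIELD_MAPS[rec.vendor]
    common = {mapping[k]: v for k, v in rec.fields.items() if k in mapping}
    for (fix_isin, field), value in SHARED_FIXES.items():
        if fix_isin == isin and field in common:
            common[field] = value
    common.update(firm_overrides or {})  # per-firm control level
    return common

rec = VendorRecord("VendorA", {"Cpn": 4.25, "Mty": "2030-06-15"})
print(normalize(rec, "US0378331005", firm_overrides={"coupon": 4.2}))
# {'coupon': 4.2, 'maturity_date': '2030-06-16'}
```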
Atkin: SPReD is performing a true utility service by mapping data coming in and normalizing it for consumption by financial service applications. They are not interfering with data vendors’ commercial mechanisms. They are not providing lift-out of IT infrastructure. This normalization function reduces the cost of integration, which is the goal of the utility.
Young: The strengths of industry-led utilities are found in the utilities’ ability to streamline redundant, costly activities that are mandated for industry participants. Replacing these functions requires delivering quality in a consistent manner in order to create operational efficiencies and reduce operational risk.
O’Shea: The concept of economy of scale is the underlying strength of any “utility” or shared service, but reaching sufficient scale is one of the greatest challenges for such a service. Ensuring firms are confident in the security and reliability of a multi-tenant architecture is another key area for any such effort to focus on. The idea of an industry-led effort is both a strength and a challenge: If the industry is in charge of the utility’s direction, the logic is that it will better match its requirements. But gaining consensus across a group of firms, however small, is no mean feat, even in the non-competitive realm of reference data.
What’s the best argument for turning to a reference data utility?
Young: Utilities can remove the burden of activities that, in and of themselves, add no value for a business and its customers. This enables firms to focus on the products that will better serve their customers, as well as the industry needs of the future.
Cottingham: The mutualized processing of data will lead to a tighter business yield, improving quality and timeliness to enable a positive network effect of accurate information that allows a firm to trade effectively. By subscribing to and participating in the SPReD utility, individual firms will be able to realize their data governance policies within a realistic budget. The premise is simply to overcome the prohibitive IT budgets associated with data fixes that do not guarantee results.
SPReD offers an elegant solution to allow firms to apply parameters around how they want data to be used, supported and controlled. They can do this knowing that the core dataset has already passed through an industrialized quality and enrichment methodology to rationalize its factual nature and present it pre-market open and intra-day. So when this information hits a firm’s ecosystem, it becomes easier and cheaper to achieve straight-through processing targets throughout the trade lifecycle, freeing up essential resources to carry out profit-making operations.
Odho: In addition to quality, scalability and cost, the key argument or justification for leveraging utilities or managed services is that it frees up scarce IT and data management capacity and resources—capacity that can then be redirected to address other business priorities for which there is no utility or managed service solution. In many cases, data utilities or managed services can more cost-effectively address firms’ requirements, compared to internal effort.
Ord: If an organization traditionally struggles with managing reference data, this is an opportunity to offload that function to someone with experience and proven capabilities. That being said, the best argument in my mind would be to augment your existing internal data management services with hard-to-get data content… and if they do that well, you might be willing to offload more.
For many firms, there is still a requirement for a central platform internally, and hence a need to build the capability to manage and distribute reference data to multiple downstream consumers with differing requirements. Depending on a firm’s content needs, adding sourcing to this existing in-house capability may not be a significant effort, which might diminish the value of a utility.
Another strong argument would be offering these services to smaller firms: ones with, say, just a few core systems and a more limited set of data needs. A utility can provide data directly to those customers without the need for an internal platform and operations group managing that data before it gets to the consuming systems.
Atkin: [The ability to] reduce processing costs, simplify integration into consuming environments, and ensure comparability from various sources.
O’Shea: Firms that can successfully work with a utility must first have reached some degree of standardization and agreement on the way in which the data will be delivered and received. The tricky issue of data models aside (most large firms each have their own flavor), the best argument for using a shared service is to bring down total cost of ownership of the data management function over time. Costs may go up in the short term, as work is done to connect to a utility, but the intent is for the costs to be more predictable in the long term.
What reference data complexities are priorities for the industry, and how can those be better serviced through a utility?
O’Shea: One of the biggest challenges for any firm in the current market is dealing with the changing requirements for data consumption—be that internally or externally driven. For example, being able to meet compliance requirements by aggregating certain datasets and ensuring consistent quality of reference data across an organization is particularly important. Logically, if the industry is using similar formats and taxonomies via a utility, then ensuring data is fit for purpose should be easier.
Young: Regulations such as KYC, FATCA, and AML reporting all require data and documentation from industry participants. In many cases, what each recipient requests can differ slightly, and manual means of delivery do not allow individual efforts to be leveraged.
Utilities can help drive industry standards, resulting in fewer one-off requests and the ability to leverage operational efficiencies.
Atkin: The biggest priority is harmonization of data to facilitate comparability. This would help with adherence to BCBS 239 principles and help with the integration of data across linked processes.
Ord: Nuances across data vendors—such as content delivery formats or how vendors uniquely identify securities and entities—can be a significant challenge. If you have substantial data operations, you have to deal with—and often manage content from—multiple vendors. By leveraging a utility, there is the opportunity to take advantage of how that service handles those data vendor differences.
Normalizing or standardizing content and formats from different sources has always been an issue, and could be one of the value-adds of a utility—i.e., the ability to normalize data into the most common formats required by clients.
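As a rough illustration of that value-add, the sketch below shows one way a service might cross-reference the different proprietary identifiers vendors use for the same security. The vendor names and identifier schemes are invented for the example, not drawn from any actual utility.

```python
# Illustrative sketch only: invented vendor symbologies.
# Each vendor keys the same instrument under its own proprietary ID.
vendor_feeds = {
    "VendorA": {"A-123": {"isin": "US0378331005", "desc": "AAPL equity"}},
    "VendorB": {"B-XYZ": {"isin": "US0378331005", "desc": "Apple Inc"}},
}

def build_xref(feeds: dict) -> dict:
    """Index every vendor-proprietary ID under a common key (ISIN here),
    so a consumer can resolve any vendor's symbol to one instrument."""
    xref: dict = {}
    for vendor, records in feeds.items():
        for vendor_id, attrs in records.items():
            xref.setdefault(attrs["isin"], {})[vendor] = vendor_id
    return xref

print(build_xref(vendor_feeds))
# {'US0378331005': {'VendorA': 'A-123', 'VendorB': 'B-XYZ'}}
```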
Another complexity is that commercial models are different across vendors. So providing some sort of common pricing model for vendor data delivered via the utility would be a significant value-add to the industry.
Cottingham: As firms have grown and contracted in size, initiated tactical IT fixes and amassed more data to feed the business, a legacy of fragmented data silos has evolved. The use of different data vendors, different data models and inconsistent symbology means it can be a challenge to get all data represented via a common language. Maintaining an accurate overview of how data interacts holistically is a massive challenge. Complying with regulations, providing meaningful information to support business decisions, and achieving STP are heavily dependent on an appropriate data foundation, which is too complex to achieve within the constraints of existing legacy infrastructure.
By making data coherent as far up the data value chain as possible, with dependencies attributed across market events and market behaviors applied throughout the data hierarchy, a market truth can be established and made available to all. This approach does not operate within the constraints of each firm’s legacy environment, and providing that remedy is the sole function of the SPReD utility. This is not a function to introduce more technology into an already confused landscape, but one to fix data processes with minimal legacy IT impact.
What impact do you expect SPReD will have on data management in the industry?
Odho: Given the October announcement, it is hard to assess the impact at this early stage, since many details have yet to be revealed. Thomson Reuters just announced that it will provide content to SPReD. Which other vendors have signed up as content suppliers? Will Bloomberg provide content?
The industry impact will become clearer once SPReD matures over several years and the sponsors achieve their milestones to convert to the utility and decommission their legacy systems and data flows. Decommissioning is easily said, but it can take a very long time to achieve, and it is key to realizing the savings and ROI.
In the short term, the high-profile nature of SPReD’s sponsors—JP Morgan, Morgan Stanley and Goldman Sachs—will have the side-effect of providing credibility to utilities and managed services in general, creating an example of how services can be leveraged to accelerate and improve data management capabilities at a contracted predictable cost. There are numerous niche managed services available today (data management platforms, data content aggregators, pricing services, tick data services, analytics, etc.). However, most firms do not want to make a “bet-the-bank” commitment to a single utility, especially from a risk management perspective. Instead, they want flexibility, in terms of timing and choices. Therefore, many firms’ strategy will be to selectively leverage specific managed services that address a particular “pain point,” allowing time for controlled change management and integration of the managed service with minimal disruption to business as usual.
O’Shea: If the utility is successful in gaining significant industry traction, which could take some time, then it could have a very positive impact on the data management function overall. Firms could have much more predictable costs, and reliable and standardized datasets with which to work.
Cottingham: SPReD, through the provision of complete, accurate and timely datasets, will promote operational precision and reduce wastage. Today, every individual firm has to find its own truth from the data it purchases, managing data to correct inaccuracies and supporting processes hampered by out-of-sync data delivery. SPReD will share this effort across the industry while presenting data in a usable way that is defined by each firm to minimize integration expense.
Data stewards will become less IT project-focused and instead look for root-cause remedies to recurring issues that affect their businesses. There will be a focus on measuring meaningful metrics across the elements of data that are controlled, such as coverage, timeliness, quality and cost. SPReD will become the mechanism for imposing governance policies that not only measure these metrics but also enforce targets for successful compliance through a tangible and results-driven framework. This will allow firms to align data governance initiatives with the needs of each firm while still minimizing IT lag.
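As a simple illustration of measuring such metrics against enforced targets, consider the following sketch; the thresholds, record shapes and target values are hypothetical, not SPReD’s.

```python
# Illustrative sketch only: hypothetical governance targets and record shapes.
from datetime import datetime, timedelta

TARGETS = {"coverage": 0.99, "timeliness": 0.98, "staleness_hours": 2}

def score_dataset(records: list, expected_count: int, as_of: datetime) -> dict:
    """Compute coverage and timeliness, then flag pass/fail against targets."""
    coverage = len(records) / expected_count if expected_count else 0.0
    cutoff = as_of - timedelta(hours=TARGETS["staleness_hours"])
    on_time = sum(1 for r in records if r["updated"] >= cutoff)
    timeliness = on_time / len(records) if records else 0.0
    return {
        "coverage": coverage,
        "coverage_ok": coverage >= TARGETS["coverage"],
        "timeliness": timeliness,
        "timeliness_ok": timeliness >= TARGETS["timeliness"],
    }

now = datetime(2016, 1, 4, 9, 0)
records = [{"updated": now - timedelta(minutes=30)},
           {"updated": now - timedelta(hours=5)}]
print(score_dataset(records, expected_count=2, as_of=now))
# {'coverage': 1.0, 'coverage_ok': True, 'timeliness': 0.5, 'timeliness_ok': False}
```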
Ord: Having been involved in a managed reference data venture in the past, I can appreciate the challenges faced by SPReD. That being said, it has some significant backing and investment, plus all of these contributing firms have been very successful at building and operating their own internal reference data services and technology. Assuming this experience and know-how is used to construct and shape SPReD’s capabilities, the broader financial industry could benefit, as the growth of SPReD could help to further define and drive data management best practices… across many areas, including sourcing, operations, data quality, and content licensing.
Atkin: SPReD handles one of the big challenges (harmonization of data from vendors). There are two more: the integration of data into all the consuming applications, and the management of data quality across the full chain of supply. Its successful adoption removes a hurdle and enables firms to focus on the other areas that matter—namely, using the data for financial innovation and client service.
Young: The impact will be dependent upon the services offered, means of delivery and industry uptake. In the end, the market will determine the industry winners.