Max Bowie: Wherefore Art Thou, Transparency?

The notion of bringing transparency to market data isn't new; indeed, it has been the impetus for user groups and public forums over the years. But it is gaining renewed attention in the current market: data providers, seeing improving economic conditions, are seeking to raise prices, while end-user firms remain cautious and cost-sensitive and aren't increasing data budgets. To better manage those costs, firms are implementing "fairer" cost allocation programs that show business areas exactly what they pay for content and technology.
For example, speakers at Inside Market Data's recent European Financial Information Summit cited the need for a transparent process around how data is managed, and for transparent cost models that allocate costs fairly to business lines. Such models make business users aware not only of the price of an application or service, but also of the cost of shared resources, such as the networks and hardware required to run and access it, and of how heavily staff in one department use a service compared to others. This not only makes end-users more conscious of the costs they incur, but also makes them more proactive about managing those costs and reduces frustration with opaque recharges, speakers said.
On one hand, there's price transparency: how a source prices its data. Users frequently lament the lack of any standardized, apples-to-apples pricing for similar datasets from different vendors and exchanges, and the opacity of how providers arrive at the value of their data and translate it into the fees they charge, a complaint that also applies to over-the-counter (OTC) broker data. For example, upstart trading venues traditionally provide market data free of charge to win business, then introduce fees once they gain a certain market share. However, unlike the consolidated tape model, which adjusts each exchange's share of revenues based on resulting trading activity, exchanges aren't known for reducing their fees if their market share slips, even though their data is then arguably less representative of the market, and hence less valuable.
On the other hand, there's the issue of transparency around how providers allow firms to use the data: specifically, the lack of any standards or harmony among the terms and policies that describe permitted use. Speakers at EFIS bemoaned the irony that firms pay for applications and services to support the growth of their business, yet some of these services come with licensing terms that are "revenue-driven, not transparency-driven" and can constrain attempts to grow that business.
It may shock those outside the world of market data that there is no such thing as a standard contract for essentially the same data service from different markets or vendors, or that there is little regulatory scrutiny of transparency around data costs and policies. Though US exchanges must obtain Securities and Exchange Commission approval for any new services that introduce new fees, the process is generally viewed as a rubber stamp. So it generally falls to end-users, who bear the brunt of interpreting and managing a multitude of different contracts, to cajole exchanges into some level of harmonization. However, relevant examples of industry cooperation exist, such as the FIX Protocol: instead of each market maintaining a different order-routing protocol that forced traders to use a separate interface for each exchange, FIX provided a standard that replaced the cost of using and maintaining multiple proprietary protocols. Similarly, a standard, industry-adopted contract template could lower legal fees, allow emerging-markets exchanges to offer their data on terms already familiar to potential clients in new markets, and, by being easier to understand, reduce the amount of accidental under-reporting.
With bodies such as the World Federation of Exchanges (WFE) becoming more active on standards around issues such as cybersecurity and information protection, perhaps the WFE could also turn its attention to the standardization of contracts and policies, reducing the need for end-users or individual exchanges to carry the bulk of the burden.
Copyright Infopro Digital Limited. All rights reserved.