Max Bowie: Wherefore Art Thou, Transparency?
The notion of bringing transparency to market data isn’t new. In fact, it has been the impetus for user groups and public forums over the years. It is gaining renewed attention now, though, as data providers eye improving economic conditions and seek to raise prices, while end-user firms remain cautious and cost-sensitive: they aren’t materially increasing data budgets, and to better manage those costs they are implementing “fairer” cost allocation programs that show business areas exactly what they pay for content and technology.
For example, speakers at Inside Market Data’s recent European Financial Information Summit cited the need for a transparent process around how data is managed, and for transparent cost models that allocate costs fairly to business lines. Such models make business users aware not only of the price of an application or service, but also of the cost of shared resources, such as the networks and hardware required to run and access it, and of how heavily staff in one department use a service compared with others. This not only makes end-users more aware of the costs they incur to the business, speakers said, but also makes them more proactive about managing those costs, and reduces frustration with opaque recharges.
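As a rough illustration of that usage-based recharge model, the sketch below apportions the cost of a shared data service in proportion to each business line’s consumption. The department names, usage metric and figures are hypothetical, not drawn from the EFIS discussion.

```python
# Hypothetical sketch of usage-based cost allocation for a shared data service.
# Department names, the usage metric and all figures are invented for illustration.

shared_cost = 120_000.0  # annual cost of the shared service (licence, network, hardware)

# Agreed usage metric per business line, e.g. terminal-hours or API calls
usage = {"equities": 5_400, "fixed_income": 2_700, "risk": 900}

total_usage = sum(usage.values())

# Each business line is recharged in proportion to its share of total usage
recharge = {dept: shared_cost * units / total_usage for dept, units in usage.items()}

for dept, amount in sorted(recharge.items()):
    share = usage[dept] / total_usage
    print(f"{dept:>12}: {share:6.1%} of usage -> ${amount:,.2f}")
```

The value of such a breakdown lies less in the arithmetic than in the visibility: each business line can see the metric driving its charge, rather than receiving an opaque recharge.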
On one hand, there is price transparency: how a source prices its data. Users frequently lament the lack of any standardized, apples-to-apples pricing for similar datasets from different vendors and exchanges, and the lack of clarity around how providers arrive at the value of their data and translate it into the fees they charge, something that also applies to over-the-counter (OTC) broker data. For example, upstart trading venues traditionally provide market data free of charge to win business, then introduce fees once they gain a certain market share. However, unlike the consolidated tape model, which adjusts each exchange’s share of revenues based on the resulting trading activity, exchanges are not known for reducing their fees if their market share slips, even though their data is then arguably less representative of the market and hence less valuable.
On the other hand, there is the issue of transparency around how providers allow firms to use the data, and more specifically the lack of any standards or harmony between the terms and policies that describe permitted usage. Speakers at EFIS bemoaned the irony that firms pay for applications and services to support the growth of their business, yet some of those services come with licensing terms that are “revenue-driven, not transparency-driven” and can constrain that very growth.
It may shock those outside the world of market data that there is no such thing as a standard contract for essentially the same data service from different markets or vendors, or that there is little regulatory scrutiny of transparency around data costs and policies. Though US exchanges must obtain Securities and Exchange Commission approval for any new services that introduce new fees, the process is widely viewed as a rubber stamp. So it generally falls to end-users, who bear the brunt of interpreting and managing a multitude of different contracts, to cajole exchanges into some level of harmonization. However, relevant examples of industry cooperation exist, such as the FIX Protocol: instead of each market having a different order-routing protocol that required traders to use a different interface for each exchange, FIX provided a standard that could replace the cost of using and maintaining multiple proprietary protocols. Similarly, a standard, industry-adopted contract template could lower legal fees, allow emerging markets exchanges to offer their data on terms already familiar to potential clients in new markets, and, by making contracts easier to understand, reduce the amount of accidental under-reporting.
With bodies such as the World Federation of Exchanges (WFE) becoming more active on standards around issues like cyber security and information protection, perhaps the WFE could also turn its attention to standardizing contracts and policies, reducing the need for end-users or individual exchanges to carry the bulk of the burden.