Max Bowie: Wherefore Art Thou, Transparency?
The notion of bringing transparency to market data isn’t new. In fact, it has been the impetus for user groups and public forums over the years. But it is gaining renewed attention in the current market: data providers, seeing economic conditions improve, are seeking to raise prices, while end-user firms remain cautious and cost-sensitive and are largely holding data budgets flat. To better manage those costs, firms are implementing “fairer” cost-allocation programs that show business areas exactly what they pay for content and technology.
For example, speakers at Inside Market Data’s recent European Financial Information Summit cited the need for a transparent process around how data is managed, and for transparent cost models that allocate costs fairly to business lines. Such models make business users aware of the costs they incur—not only the price of an application or service, but also the cost of shared resources such as the networks and hardware required to run and access it, and how much staff in one department use a service compared to others. This not only makes end-users more aware of the costs they incur for the business, speakers said, but also makes them more proactive about managing those costs, and reduces frustration with opaque recharges.
On one hand, there’s price transparency—how a source arrives at the price of its data: Users frequently lament the lack of any standardized, apples-to-apples pricing for similar datasets between different vendors and exchanges, and the opacity around how providers assess the value of their data and translate it into the fees they charge—something that also applies to over-the-counter (OTC) broker data. For example, upstart trading venues traditionally provide market data free of charge to win business, then introduce fees once they gain a certain market share. However, unlike the model of the consolidated tape, which adjusts each exchange’s share of revenues based on resulting trading activity, exchanges aren’t known for reducing their fees if their market share slips—even though their data is arguably less representative of the market and hence less valuable.
On the other hand, there are the issues of transparency around how providers allow firms to use the data, and more specifically, the lack of any standards or harmony between the terms and policies with which they describe how firms can use it. Speakers at EFIS bemoaned the irony that firms pay for applications and services to support the growth of their business, but that some of these services come with licensing terms that are “revenue-driven, not transparency-driven” and can constrain attempts to grow their business.
It may shock those outside the world of market data that there’s no such thing as a standard contract for essentially the same data service from different markets or vendors—or that there is little regulatory scrutiny of transparency around data costs and policies. Though US exchanges must obtain Securities and Exchange Commission approval for any new services that introduce new fees, the process is generally viewed as a rubber stamp. So it generally falls to end-users—who bear the brunt of interpreting and managing a multitude of different contracts—to cajole exchanges into some level of harmonization. However, relevant examples of industry cooperation exist, such as the FIX Protocol: Instead of each market having a different order-routing protocol that required traders to use a different interface for each exchange, FIX provided a single standard that replaced the cost of using and maintaining multiple proprietary protocols. Similarly, standardized contracts and terms could lower legal fees by providing a standard, industry-adopted template, and would allow emerging markets exchanges to offer their data under terms already familiar to potential clients in new markets, while easier-to-understand contracts would surely reduce accidental under-reporting.
With bodies such as the World Federation of Exchanges (WFE) becoming more active on standards around issues such as cyber security and information protection, perhaps the WFE could also turn its attention to standardization of contracts and policies, reducing the need for end-users or individual exchanges to carry the bulk of the burden.
Copyright Infopro Digital Limited. All rights reserved.