Opening Cross: The Longevity of Latency as Smarts Outpaces Speed
Though data latency attracts plenty of attention because of its importance to algorithmic—not just high-frequency—trading, it isn't the only game in town. And because it is bounded by physical limits—i.e. the speed of light (or whatever proves faster, should we ever find another way to transmit data)—it has a limited shelf life for delivering competitive advantage, compared to inputs that might yield more value long-term.
Meanwhile, the low-latency marketplace continues to grow—by 1.5 percent in 2012 and 4.5 percent over the next three years, according to Tabb Group, which puts current sell-side spend on data distribution technologies at $3.6 billion.
And beyond the most liquid, exchange-traded asset classes already straining the limits of latency, the over-the-counter markets have a long way to go before they exhaust the potential for latency reduction. But firms are already applying low-latency technologies in these markets, and will surely expand them to “low-frequency” asset classes as the dynamics of those instruments change due to shifts toward centrally-cleared venues and as investors seek assets with higher potential returns.
This could prompt institutional traders to desert unprofitable equity markets entirely for OTC assets, accelerating the evolution of those markets and increasing data demand. It would have the reverse effect on exchanges, which would need to leverage other business models to maintain revenues—such as increasing their focus on derivatives trading and clearing, or, as BT's Chris Pickles suggests in this issue's Open Platform, acting as a neutral "messaging hub" between markets and participants.
This would free up equity markets to fulfill what some argue is their true role—enabling companies to raise capital, rather than being barometers of short-term volatility—and increase their appeal to long-term investors concerned about being outpaced by high-frequency traders.
With a different makeup of participants, exchanges may also have to provide more data free of charge for lower-end investors—not an appealing prospect, as data revenues grew in Q1 while overall exchange revenues fell. However, they could offset any losses by leveraging their central position as aggregators of liquidity and information to capture more data, translate that into new types of datasets and signals, and charge a premium for it. Demand is growing for exchange-like data on OTC asset classes. Examples include the Datavision Streaming tick-by-tick data service for OTC credit instruments launched last week by credit specialist CMA, based on prices from market participants; Benchmark Solutions' streaming market-driven pricing; and transaction cost analysis for markets like currencies—such as that launched last week by agency broker ITG—which could provide an additional input for decision support.
And factors currently used to assess risk could be applied to create new trading indicators. For example, risk and portfolio analysis tools provider Axioma last week presented its quarterly risk review, revealing lower risk and volatility levels across global markets—except China—in Q1 than in the previous quarter.
One way to reduce risk is to diversify by minimizing correlation, since a "diverse" portfolio of stocks that behave similarly is not really diverse at all. And because futures prices may not accurately reflect their underlyings—oil futures, for example, include the cost of transportation—Axioma creates synthetic prices, based on the other factors affecting an asset, that more accurately reflect its value. Though these models are designed to support long-term decisions rather than track intraday price movements, why couldn't they be used to create real-time synthetic prices that expose market inefficiencies in future?
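The diversification point above is easy to make concrete. The sketch below—an illustrative example, not Axioma's methodology—measures the average pairwise correlation of a basket of return series: a basket of nominally different assets driven by one common factor scores close to 1.0, signaling little real diversification. The function name and the toy factor model are assumptions for illustration.

```python
import numpy as np

def avg_pairwise_correlation(returns: np.ndarray) -> float:
    """Average off-diagonal correlation of asset return series.

    returns: array of shape (n_periods, n_assets). A result near 1.0
    means the basket moves in lockstep and offers little diversification.
    """
    corr = np.corrcoef(returns, rowvar=False)   # (n_assets, n_assets)
    n = corr.shape[0]
    off_diag = corr[~np.eye(n, dtype=bool)]     # drop the diagonal of 1s
    return float(off_diag.mean())

# Toy example: three "different" stocks driven largely by one common factor
rng = np.random.default_rng(0)
common = rng.normal(size=250)                   # shared market factor
returns = np.column_stack(
    [common + 0.3 * rng.normal(size=250) for _ in range(3)]
)
print(avg_pairwise_correlation(returns))        # high, despite three names
```

A portfolio manager screening for genuine diversification would want this number low, not merely a long list of tickers.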
So, will low latency become less important over time? No—because it becomes the benchmark rather than the cutting edge, and because high-performance technologies will be crucial to calculating valuable new data inputs quickly enough to meet that benchmark.