Opening Cross: If You Can't Find the Grail, Should Latency Devotees Change Religion?

The quest for the holy grail of low latency has been one of the driving forces of innovation in the market data industry over the last decade. But as further reductions become harder and more expensive to achieve, firms are looking for new ways to find value and speed-related gains that don’t revolve solely around cutting out linear chunks of latency.
Participants in the Latency special report accompanying this week’s issue note that data processing latency now presents the biggest challenge, and that costs will only grow for firms whose strategies are highly dependent on latency, while firms without those resources must either find a different way to gain an advantage, or “rent” low-latency capabilities from those that have them. One of the key problems is that latency is a moving target: achieving a certain level of latency doesn’t matter if someone is still faster than you, and everyone else is trying to do exactly the same thing.
One approach is to look at related factors such as jitter and reliability, which can undermine an otherwise low-latency environment, or at metrics gained through latency measurement that can be used to create custom datasets to support strategy development and risk management, such as those that latency monitoring software vendor Correlix is enabling clients to create via a new software development kit for extracting data (IMD, March 12). Another is to focus on clock synchronization, because time, money and effort spent saving nanoseconds are wasted if there are microsecond differences between servers in the same architecture, or between your servers and those of the exchange. For example, next week, Deutsche Börse will introduce a new timing solution for trading firms in the co-location facility for its Xetra cash market and Eurex derivatives exchange.
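To make the clock-synchronization point concrete, here is a minimal Python sketch of NTP-style offset estimation between two machines, using hypothetical timestamps. It is not the Deutsche Börse timing solution or Correlix’s measurement method, just an illustration of how microsecond-level clock skew can swamp the nanoseconds being chased elsewhere.

```python
# Illustrative only: estimating the clock offset between two servers from
# NTP-style round-trip timestamps. All timestamp values are hypothetical.

def estimate_offset_and_delay(t0, t1, t2, t3):
    """t0: request sent (client clock), t1: request received (server clock),
    t2: reply sent (server clock), t3: reply received (client clock).
    Returns (estimated clock offset, round-trip network delay) in seconds."""
    offset = ((t1 - t0) + (t2 - t3)) / 2.0
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay

# Hypothetical timestamps in seconds: the "server" clock runs ~45 microseconds ahead.
t0, t1, t2, t3 = 0.000000, 0.000095, 0.000105, 0.000110
offset, delay = estimate_offset_and_delay(t0, t1, t2, t3)
print(f"estimated offset: {offset * 1e6:.1f} us, round-trip delay: {delay * 1e6:.1f} us")

# A one-way latency computed naively across unsynchronized clocks is off by
# roughly the offset -- far larger than the nanoseconds saved elsewhere.
naive_one_way = t1 - t0
corrected_one_way = (t1 - t0) - offset
print(f"naive one-way: {naive_one_way * 1e6:.1f} us, "
      f"offset-corrected: {corrected_one_way * 1e6:.1f} us")
```

With these made-up numbers, a 45-microsecond estimated offset turns a naive 95-microsecond one-way measurement into roughly 50 microseconds once corrected, which is the scale of error clock synchronization is meant to remove.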
But as participants in the report explain, many of the gains that can be achieved by reducing geographical latency through initiatives like co-location have already been exhausted, except in emerging markets where high-frequency trading has yet to take off. In these markets, fundamental analysis and research—such as that provided by vendors like ISI Emerging Markets, which is embarking on a comprehensive revamp and expansion of its research platform—can be more useful, especially where data volumes and frequency remain at levels more conducive to deeper research.
In other markets, spiraling volumes not only preclude in-depth analysis but also place enormous pressure on infrastructures stretched to capacity, driving interest in on-demand, virtual computing, and potentially in the ability to rent latency services, as outlined by participants in our Latency report. And with NYSE’s ArcaBook Options and Amex Options feeds accounting for nearly three million messages per second between them, according to the latest statistics from the Financial Information Forum, volumes continue to rise. In fact, the FIF—along with partners Exegy and MarketPrizm—last week rolled out a European version of its MarketDataPeaks historical peak-volume portal to help European firms with traffic analysis and capacity planning, illustrating that capacity can be just as important a consideration as speed itself.
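As a rough illustration of the capacity-planning arithmetic such peak-volume statistics support, the sketch below projects a provisioning target from an observed peak. The growth and headroom factors are assumptions for illustration, not FIF or MarketDataPeaks figures.

```python
# Illustrative capacity-planning arithmetic. The observed peak is the figure
# cited above; the growth rate and headroom factor are assumptions.

observed_peak_mps = 3_000_000      # ~3m msgs/sec across the two options feeds cited
annual_growth = 0.40               # assumed year-on-year message-rate growth
headroom_factor = 2.0              # assumed safety margin over the projected peak

projected_peak_mps = observed_peak_mps * (1 + annual_growth)
required_capacity_mps = projected_peak_mps * headroom_factor

print(f"projected peak: {projected_peak_mps:,.0f} msgs/sec")
print(f"provisioned capacity target: {required_capacity_mps:,.0f} msgs/sec")
```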
High data volumes also make it more important to route orders as efficiently as possible—not just in terms of which market you route to, but also which port you connect to, based on which will provide the highest throughput and lowest latency. Cape City Command’s new CC-7030 Trade Xccelerator helps firms do this by analyzing throughput and latency across ports and providing a stream of recommendations on which ports to route trade flow through, and which to avoid because they introduce higher latency.
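The kind of port-ranking logic described above might look something like the following sketch, which orders hypothetical ports by tail latency and throughput and flags which to avoid. The port names, measurements and 150-microsecond budget are invented for illustration and do not reflect the CC-7030’s actual analysis.

```python
# Illustrative only: ranking order-entry ports by measured latency and throughput.
# Port names, samples, and thresholds are hypothetical.

from statistics import quantiles

# Hypothetical round-trip latency samples (microseconds) and observed throughput
# (messages/sec) per port, as a latency-monitoring layer might collect them.
port_stats = {
    "port_a": {"latency_us": [85, 92, 88, 210, 90, 87],  "throughput_mps": 45_000},
    "port_b": {"latency_us": [60, 62, 61, 63, 64, 61],   "throughput_mps": 52_000},
    "port_c": {"latency_us": [58, 59, 400, 380, 60, 62], "throughput_mps": 30_000},
}

def p99(samples):
    """Approximate 99th-percentile latency from a small sample set."""
    return quantiles(samples, n=100)[98]

# Rank ports by tail latency, then by throughput; flag ports whose tail latency
# exceeds an (assumed) 150us budget as ones to avoid.
ranked = sorted(port_stats.items(),
                key=lambda kv: (p99(kv[1]["latency_us"]), -kv[1]["throughput_mps"]))

for name, stats in ranked:
    tail = p99(stats["latency_us"])
    verdict = "avoid" if tail > 150 else "prefer"
    print(f"{name}: p99 ~{tail:.0f} us, {stats['throughput_mps']:,} msgs/sec -> {verdict}")
```

In this toy example the steadiest port wins even though another port has a lower best-case latency, which is the jitter point made earlier: tail behavior matters as much as the headline number.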
So where can you find the next big latency advantage? Chances are, by the time you’ve asked the question, you’ve already missed it.