Panels: Firms Must Cut Internal Latency
"We have many customers who have found that a lot of their systems have hit the wall, and that the only way... to get the response times required... is to co-locate business logic with their stream processing," said Patrick May, director of sales engineering at Gigaspaces, speaking at last week's Waters USA conference.
"Traditionally, we receive a data stream, and store it for [immediate] use and for reporting purposes," said Sinan Baskan, director for financial markets solutions at Sybase, who spoke at the same event. "But now, because of latency requirements, we have to deal with data as several layers of streams... serving different requirements."
Scott Atwell, manager of FIX trading and connectivity at American Century Investments, said that not only do different end-users have differing latency requirements, but a single firm can have different needs, depending on how data is being consumed. "Different applications are using the same data in different ways... for statistical arbitrage, algorithmic trading... or for testing strategies," each of which has different latency demands, Atwell said.
In addition, different types of firms have widely varying needs. While most panelists reported that the entire market is demanding low latency, Atwell estimated that perhaps only 15 percent of the overall market has major low-latency needs, given the large number of long-only investment managers.
Echoing the sentiment at an event in London hosted by Intel, Kevin Houstoun, director of Bidroute and IT co-chair of FIX Protocol Limited, said the industry is already seeing the evolution of two-tier data architectures, where banks provide secure and reliable general-purpose architectures, while specialized trading operations focus on very high-speed applications and infrastructures.
To implement the high-speed architectures required for algorithmic trading, however, the next step of latency reduction potentially lies within firms' own infrastructures rather than with low-latency feeds, officials said. "The key is the backbone that keeps that data moving," Baskan said. "Now, models are fragmented, data is fragmented, and moving that requires a lot of technology."
In addition, moving data between applications introduces latency, as well as complexity resulting from the need to translate between different formats and interfaces, said May. "If you can... centralize [data]... in a single, cohesive unit... you can reduce that latency problem."
But when fine-tuning latency-sensitive applications, panelists in London also noted that advances in processor technologies are not always the solution to firms' problems. "Our problem is our I/O [and] our memory access," said Ray O'Brien, global head of corporate, investment and transaction banking IT at HSBC.
"We find most of our bottlenecks aren't on the CPU... they're on the network card, they're on disk I/O; they're not on the processor in isolation," agreed Houstoun.
Max Bowie with Jean-Paul Carbonnier
Copyright Infopro Digital Limited. All rights reserved.