The Next Big Lift?
Tim Bourgaize Murray explores opportunities for brokers to take on more of buy-side firms' data management burden.
Your phone provider tracks your location. The fridge can ping you when you’re low on milk. Your watch asks you to take a minute and breathe. Cars will soon drive us around by themselves. Yes, the Internet of Things represents massive leaps of innovation in 2017, but at its core there’s still a human choice—to augment the relationship we once had with these machines, and with the companies that build them. The reasons aren’t hard to see: cheaper, easier, healthier all come to mind. Automatic. And what do the companies get out of all this? Customer loyalty as we come to rely on them more and more—and, just as lucrative, data.
It may be years before a Bloomberg terminal can really learn and adapt to a trader’s sixth sense. But for a bevy of reasons, buy-side data users today are looking throughout the financial ecosystem to delegate more data responsibilities. Part of it is a cost issue: market data access is becoming prohibitively expensive, while maintaining proprietary infrastructure is intensive, its value increasingly “arbitraged away,” as one source put it. But what’s new is that some brokers are pondering improved dexterity in this crowded space, betting that the more data activities a buy-side client parks with them, the less likely that client is to stray. What they’re finding is interesting: lots of space to run, but a lot of obstacles, too. So, where can this kind of arrangement work best, and where do brokers and clients still draw the line? After all, your Apple Watch, sensing a low heart rate, may coyly instruct you to jump right off a cliff. That doesn’t mean you will—unless you just happen to be harnessed and ready.
The Opportunity’s There
As Inside Data Management reported in December, both asset owners and asset managers are rapidly externalizing as many data management responsibilities, including market data sourcing, as they can. A growing variety of providers have jumped in to vie for the opportunity to help them do so. Yet one very important class of actor was largely missing in that discussion: the sell side.
It’s not for lack of opportunity. Market data fees have become closely guarded revenue centers for exchanges; they are estimated to have risen by 60 to 65 percent from 2010 to 2015 alone, while venue fragmentation over that same period has pushed certain markets—currencies and options, especially—to the operational brink, and left end-users feeling pinched.
Troy Googins, a solutions architect and head of product management at Wolverine Execution Services (WEX), the broker and tech arm of Chicago-based Wolverine Trading, tells IDM there is a “growing desire” among clients to reduce the infrastructure costs associated with trading in several areas: exchange connectivity, market data, and software development. “We offer clients using the WEX Trading Platform (WTP) almost every equity, options, and futures datafeed the exchanges or other providers make available, but we have seen that our clients are making the decision to eliminate some of the more specialized feeds,” he says. “Recently, the price of one of those datafeeds doubled. Nearly every WTP client we contacted decided to drop the feed instead of paying the higher fee.”
Meanwhile, investors are also consolidating their partners. According to Tabb Group’s recent Broker Relationships in an Era of Full Disclosure research survey, which describes a “crossroads” in the buy-sell side relationship, investment managers are looking for primes who can stand out, while slowly but surely peeling off the rest. Technical wherewithal—from liquidity provision on down—is among their criteria, and improved front-end systems, greater transparency into dark pools, and bundled execution services have all advanced as a result. This is in part to keep up with the rise of independent and buy-side-owned execution arms like Citadel Execution Services and WEX, and buy-side-only trading venues like Luminex.
“There is an advantage for larger firms as they have the resources to invest in the data collection, algorithmic design and modeling for their own specific requirements,” explains Jonty Field, former head of trading analytics at AHL Man Group and EMEA head at Quantitative Brokers, an agency-only broker specializing in interest-rate futures. “That being said, the democratization of processing power, the quality of data, such as CME’s new Market Data 3.0 release, and the emergence of firms from the Fintech arena who can address some of the key challenges in a more direct and responsive way than traditional banks, are certainly leveling the playing field.”
A Natural Next Step?
So, for behemoth and boutique brokers alike, the question is whether market data processing and management can be one of those key challenges—and if it ought to be. WEX’s Googins points out: “As the number of trading venues continues to grow, the ability to have the expertise and the technology to capture market data and execute orders efficiently—whether the client objective is high fill rates, reduced slippage, or a combination of both—becomes more expensive and complex.”
In other words, sell-side firms, just like their clients, will naturally wonder how they can convert that investment into stickiness.
Field says there are two key areas to focus on. On one hand, there is research: “When one is designing strategies for alpha generation, or indeed assessing historic risk metrics for portfolio design and stress testing, the benefits of cloud computing are quite compelling. [On the other hand] the ‘live trading’ perspective is more about optimally processing the data from an exchange and being in a position to act on any signals fast enough to retain their value,” he says. “For this second part, the topics of co-location and hosting, as well as the feed handlers for one’s market data, dominate. The sheer volume of data to ship from the exchange into the cloud means such work is still best done at the venue.”
And this latter area is seeing fresh attention, says David Taylor, chief technology officer at ticker plant appliance and feed handler provider Exegy. The St. Louis-based vendor has certainly seen a maturation in the types of trading being executed—at least so far as the kinds of feeds and normalized data clients are asking for, he says. But another trend has crept up, too. “It’s been interesting, and a little bit of a surprise to see these kinds of clients—agency brokers, sell-side dark pools, ATSs, as well as buy sides—coming to us and asking more and more about a holistic solution,” Taylor says. “They’re asking us about sourcing raw market data, providing hosted environments for equipment, partnering out for other pieces of infrastructure. A much greater diversity of firms are beginning to see the value in it.”
Who Does the Legwork?
There is no disputing the legwork involved in delivering this value. Taylor points to a myriad of requirements: vending the raw feeds; the design of the network—whether it is built for efficiency, with a central hub distributing data out, or for performance, with direct connections and minimal propagation delay—and normalization that produces actionable trading signals, for example by determining the best price (NBBO or user-defined) across fragmented markets, or by calculating an exchange-traded fund’s net asset value (NAV) from its components and comparing it against the fund’s market price. All of it, he adds, must be scalable, so that wider coverage of venues can be achieved quickly without impacting performance, and must be able to interface with other front-office technology for flexibility of order routing and pre-trade risk checks. No small task.
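To make those two normalization examples concrete, below is a minimal sketch in Python, assuming entirely hypothetical venue names, quotes, and basket weights (none are drawn from the article): it derives a best bid and offer across fragmented venues, and a simplified indicative NAV for an ETF basket.

```python
# Illustrative sketch only: a toy version of the normalization step Taylor
# describes. All venues, prices, and basket components below are hypothetical.

from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    bid: float
    bid_size: int
    ask: float
    ask_size: int

def best_bid_offer(quotes):
    """Best price across venues: highest bid, lowest ask (an NBBO-style calculation)."""
    best_bid = max(quotes, key=lambda q: q.bid)
    best_ask = min(quotes, key=lambda q: q.ask)
    return best_bid, best_ask

def indicative_nav(components, shares_outstanding):
    """Sum component values and divide by shares outstanding (a simplified iNAV)."""
    basket_value = sum(price * shares for price, shares in components)
    return basket_value / shares_outstanding

if __name__ == "__main__":
    quotes = [
        Quote("VENUE_A", 99.98, 300, 100.02, 200),
        Quote("VENUE_B", 99.99, 100, 100.03, 500),
        Quote("VENUE_C", 99.97, 400, 100.01, 100),
    ]
    bid, ask = best_bid_offer(quotes)
    print(f"Best bid {bid.bid} on {bid.venue}, best ask {ask.ask} on {ask.venue}")

    # Hypothetical ETF basket: (component price, shares per fund share)
    components = [(50.25, 2.0), (120.10, 0.5), (35.40, 4.0)]
    inav = indicative_nav(components, shares_outstanding=3.0)
    print(f"Indicative NAV per share: {inav:.4f}")
    # A premium/discount signal would compare this iNAV to the ETF's market price.
```

A production ticker plant would of course do this in hardware or low-latency native code over live feeds, at scale and under pre-trade risk checks; the sketch shows only the shape of the computation.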
Whether that is an advantage or a problem for the sell side, though, is up for debate. Taylor sees such a project as best suited to the front-end vendor community, while Peter Durkan, chief executive of foreign exchange infrastructure provider Lucera and former head of high-frequency trading at Cantor Fitzgerald, argues that while the market data space is highly competitive, few vendors have the desire or resources to pull it all together the way a large prime broker potentially could—a historical byproduct of the space’s many specialized realms.
“Increasingly, we’ve been asked for exchange data, and for a connectivity specialist like Lucera, it’s just a different proposition,” he says. “There’s a lot of legal lift involved to be a vendor of record; it’s non-trivial to collect, disseminate, and potentially cleanse it, but brokers already have this data coming into their systems, are using it in certain ways now, and can leverage those relationships. The incremental cost of scaling those features is relatively small for them; it’s much larger for us.”
The idea of a broker minding your market data raises a number of intriguing questions, the first of which is simple: even if it were cost-effective to build, and lingering mistrust among buy-side firms evaporated, who would use it? Whereas even the largest institutional investors have moved some data management out of house, sources say this more radical idea is still very much one of segmentation—and not just by size.
For example, the Tabb survey notes several comments from traditional asset managers admitting they were openly agnostic as to how their orders got filled, as long as their impact was minimal. Conversely, proprietary trading shops and market makers are likely to keep their data processing very close to the vest. The former probably see limited added value in running extensive execution analytics or sitting nearer to the exchange; the latter already feel they can do it better than anyone, including their prime.
Still, some see a soft middle—of newly seeded managers, or even established mid-size players with a simple market-making thesis. “A lot of asset managers are doing pretty similar things with market data, looking across some level of transaction cost analysis or analyzing corporate actions,” Durkan says. “Cleansing these kinds of data is very well understood and not overly proprietary. To take that off their plate would make you an attractive partner, when you consider that many of these firms have moved beyond static fundamental data to incorporating social media, sentiment analysis and natural language processing into their process, too. In order to distinguish yourself now, you have to consume a huge amount of data. Building infrastructure for all of it doesn’t make a lot of sense.”
Bigger Hurdles
Others cite reasons why they aren’t so sure. For one thing, most exchanges’ fee structures would make such a service uneconomic for all but the largest brokers, Taylor explains.
“Working in the past with prime brokers trying to provide clients with a richer market data solution, we’ve often heard mixed results,” he says. “When it hasn’t worked, it’s often because of the cost of the exchange charging for your clients as programmatic or non-display users. Regardless of how you help them amortize the cost of getting connected to the exchange data, the exchange is always going to need to know who is receiving that data. Most of the main US equities exchanges are going to bill those clients directly as non-display users, and unless you’re Bloomberg or Reuters, you’re not going to be able to justify an enterprise-level deal. We’ve seen different ways explored to create a path around that, but a smaller broker just doesn’t have the scale.”
For another thing, while buy sides will typically have only a handful of preferred brokers and may well be whittling down their total number of relationships, using a broker to process market data concentrates yet another key dependency in just one of them, at a time when clients—and increasingly, regulators—still want a level of flexibility. Quantitative Brokers’ Field cites a familiar culprit: “With increased regulatory pressure to ‘unbundle’ the costs that execution brokers absorb on behalf of their clients, the times of terminals or research being picked up by the execution brokers are in the past,” he predicts.
For the brokers, creeping customization demands will present more of a philosophical struggle that many infrastructure providers also face. “The closer you move to the proprietary heart of a given trading business, the more emphasis there will be on flexibility, which is hard to maintain without sacrificing performance. You find yourself pressured to become more of a consulting shop, a platform that is then customized for a particular firm’s use,” Taylor says. “We’ve looked at more proprietary calculations for clients, but for the most part, we identify common computations that they do downstream on the ticker plant to compute different views of data—and then by implementing those in our hardware-accelerated box, we don’t trade off performance. But we’re still not so far as giving you a parameterized application that you feed data into, and use as your trading strategy. We haven’t gone there because, again, it’s really a consulting business model. Our focus is a fast product using superior technology.”
Beyond Ridiculous
Given these dynamics, sources say it isn’t hard to see why market data management is something of a third rail for the sell side: it’s a cost gamble for the independents, and a potential mess for the tier-one firms to do well—both losing propositions. As Taylor concludes, “Right now, between cost, complexity and relationship sensitivity… there are three or four reasons, where you only need one.”
And yet interest still percolates. One way to square it, sources say, is to keep refining the sell side’s order-entry proposition in the short term, to the point where a buy-side firm’s affinity for its broker may justify taking the next step, or to wait until market forces (or new rules) effect a change in exchange fees that makes sell-side management an economical proposition.
Field points out, for one example, that the lack of meaningful back-testing for most algorithms is still surprising. “The buy side always walks a delicate balance, and ultimately asset managers and hedge funds need to demonstrate that they can return value to their investors. That might be a lower-cost version of an existing alpha or beta model, or a genuine edge over the pack. But the regulatory environment has led to a far greater focus and transparency on execution, and on the data itself,” he says, adding that a “structural shift” towards specialization is under way.
Like that Apple Watch, it’s a matter of forging the right relationship over years, with the right benefits relative to cost. Call it a tech problem, but it’s also a matter of comfort and convenience.
As Durkan puts it, “It’s a heavy lift, and perhaps we’re not ready for it yet. Five or 10 years ago, though, offloading core data infrastructure would have seemed ridiculous, and now the industry is used to it. I see that kind of change continuing.”