Three’s Company: Platform Providers Get In On The Act

Some firms are seeing benefits in having core platform suppliers manage their data requests, rather than maintaining direct relationships with data vendors.


“I’m sorry. For now, I just need to see what everyone else is seeing.” 

There was a time in the early days of electronic market data, in 1989, when a Reuters salesperson, trying to persuade a money manager away from Telerate’s government debt reporting service, heard those very words. It was a teaching moment. Even though the Reuters product was superior, as the client admitted, bucking the trend just wasn’t worth the risk. 

Fast-forward to today: Telerate is long gone. And Thomson Reuters—its buyer, now a data giant in its own right—is still tackling that same nagging question, but now against an even bigger behemoth across Midtown Manhattan.

But what has dramatically changed is the client. Many firms are now taking an eager second look at how they access their market and reference data, and asking questions—from whom, for how much, and to what end—that may have seemed off limits before. Some are concluding that direct relationships, even those governing some of their firm’s largest and most crucial cost centers, aren’t necessary, and are instead farming out data consumption management and related activities to one of several other partners: their front-office order management platform, a managed data service, their custodians, and others.

For a variety of reasons, no one Inside Data Management interviewed for this story characterized that shift as an immediate problem for incumbent core data—at least not yet. Rather, firm and vendor sources tell IDM that it simply reflects the expanding influence of a new mindset. Investment managers’ stronger disposition toward as-a-service models, transformation initiatives, and willingness to move more operational functions “outside of the shop” are all creeping toward market data feeds and terminals. While this doesn’t spell the end for the old guard, clients’ new comfort with intermediation certainly has them paying attention.

“I do not see it as significant disintermediation; rather, the dimensions of the conversation between client and vendors are changing.” Brian Buzzelli, Acadian Asset Management

New Realities

Some in the industry say this trend is fairly limited—that established relationships are for the most part unchanged. Others say it offers a glimpse of what’s soon to come. But most agree on certain new realities that are driving the trend along—that the talk is demonstrably different.

“I do not see it as significant disintermediation; rather, the dimensions of the conversation between client and vendors are changing,” says Brian Buzzelli, senior vice president and head of data governance at Acadian Asset Management, a Boston-based investment firm with $69 billion under management. “Many discussions expand beyond data and features, and include other third parties, a focus on reshaping our data management operational model toward greater efficiency, lower cost and higher value, and increasing consideration and potential use of industry utilities, expanded business process outsourcing (BPO) and hosted propositions,” he says. “They typically include more than two parties, require new contractual frameworks, and have significant impact on management decisions that are fundamentally tied to vendor data licensing. Still, vendors and clients have every opportunity to engage, and while the client side recently may be more comfortable with third-party data management services and utilities, vendors recognize the efficiencies and economies of scale these models may offer to financial firms, and will need to respond.”

To understand why, start with where spend is growing. Research from Burton-Taylor International Consulting indicates that recent year-on-year market data spend is relatively flat, with modest 1 to 2 percent annual increases among core providers, accounting for fees. By contrast, within those numbers, risk and compliance data is seeing growth as high as 9 percent.

Douglas B. Taylor, the consultancy’s founder and managing partner, says that a wide array of providers in those areas see the same projections and are jumping in. If clients are prioritizing less raw data consumption and more of its manipulation and analytics in these areas, they are naturally asking the vendors with the best tools for those activities—who are traditionally not the data suppliers themselves—about managing data consumption, too.

Variations on a Theme

The reasons behind those numbers are manifold. Sources point to the current environment of compressed margins, with most firms looking to cut costs and many also searching for new sources of alpha—diversification, hedging with exchange-traded funds, and other risk-chasing strategies—even while they are forced to commit more spend to regulatory compliance. 

Sitting between these priorities is an organizational change, as well: adoption of the chief data officer role and far more investment in and scrutiny of enterprise data governance, particularly as it drives transformative projects like investment book of record (IBOR) and reference data standardization initiatives. “These people are now saying, ‘We’re not buying data you’re not using,’” says Martijn Groot, vice president of product strategy at Asset Control. “You’ll now have these layers developing: shared data, enterprise data that stays inside, and environmental datasets that are specific to client portfolios, all with a closer sense of where data gets used. On top of that, we’ve also seen more utilities and even clearinghouses making new data more freely available.”

Accordingly, areas like data consumption, where firms increasingly see little value-add, are now up for debate. 

“Because of foundational work in the industry in recent years, firms think of data as more of a commodity than intellectual property. Upstream data challenges today are born of complexity, not volume,” says Liz Blake, global head of front-office solutions at Eagle Investment Systems.

“Ten years ago, you would have to go to 10 sources for benchmarks. That’s now standardized and comes from one source,” adds Subbiah Subramanian, senior vice president of State Street’s DataGX. “Back then, in my previous role with Putnam Investments, over the course of the day we had lots of data coming in, errors piling up, data that was delivered late, duplicates across our functions—all really mundane stuff. That’s one part of the challenge; the other aspect is the technology they’re managing it on, and maintaining the platform. Chief data officers will spend seven hours of their day on this stuff, and asset owners are seeing this, as well. As the CDO at a large pension client recently told me, ‘Anything that is non-alpha generating, we don’t want people focusing on. We want it out.’”

Customization

And so the bet, as former JP Morgan Asset Management CDO Dessa Glasser explains, is actually simple: Mainstay data vendors are in the business of providing content and tools to work with their data, not customizing their services for each client. “This leaves room for firms that can consolidate and provide services across a growing number of data vendors,” she says.

Firms will move to such a managed data service on the basis of their own fund or strategy complexity, the functions they are unwilling to give up, the maturity of their own infrastructure, and the potential efficiencies to be gained. Secular trends drive this, but so do market events, according to Steve Cheng, global head of data management solutions at Rimes Technologies.

“Data quality and timeliness have always been there, but asset managers are telling us that the ability to be flexible and responsive to change—say, an unexpected result in an election and the subsequent so-called ‘Trump Trade’—is required just to remain competitive,” Cheng says. “Engineering for that in-house is difficult, and will often mean you’re six months behind where you should be. It’s expensive to keep a large bench of reserves for when those things happen, but a specialist can do that.”

Their choice of managed data provider can also stem from the preferred method of transformation. Engineering an IBOR, for example, can begin with the order management system, portfolio accounting or back-office data management platform, and that choice might dictate who is in play. As a result, front-office systems like Charles River, Advent Software and SunGard, data stewards like Eagle Investment Systems and Broadridge, and custodial platforms like State Street Global Exchange (SSGX) are all in the mix. 

Many entrants have been building out their capabilities for the better part of the post-crisis era. State Street launched SSGX in 2013, while Charles River has been at it for six years and adopted a model for its data delivery service more typical of a startup: solution and adoption, then optimization.

“We felt clients were too often seeing the wrong data, and running back to their spreadsheets,” says Charles River data services director Brad Haigis. “So to start with, we partnered with vendors to deliver the data that we would then massage, manage, and deliver into the client portal and lifecycle. We initially started with feeds for validation of security master data: bedrock, foundational security information brought in from Interactive Data and Thomson Reuters, which both later added real-time streaming, before moving to benchmark index data and about a dozen partners on that. Next, we added on corporate actions data as we delivered our IBOR solution, and now it’s on to risk scenario analysis and attribution using historical data—prices, yield curves, and foreign exchange rates. It really evolved out of increasing demand.”

Segmentation

Providers uniformly say that the small and mid-sized shops, with less than $50 billion under management, are the target—usually with market data validation, benchmarks and certain reference data functions coming first, followed later by real-time streaming and corporate actions. Larger managers, they assumed, have the scale and churn of trading activities to justify managing data consumption on their own. But here, too, there are surprises: a growing handful of shops with $100 billion or more under management are either switching to, or pondering, different managed data options.

Some large firms are using the new tactic to rein in the enterprise, notes Acadian’s Buzzelli. “They recognize the cost of ineffective or inefficient data management operations that lead to data inconsistencies and errors, exceed industry and peer costs, and fail to meet regulatory data management standards is too high,” he says. “The days of permitting data silos and data rogues are over.”

Keith Brodhead, senior sales specialist for reference data at ICE Data Services (formerly Interactive Data), adds that key person dependency and quirky customization aren’t unique to small shops. “At a client site visit recently with a $75 billion asset manager, we were invited to speak with their head of fixed income trading. He reported his best analyst was retiring and he had no one to monitor our datafeed transfer and loading into their analytics applications. ‘Whatever we do now, it has to be compatible with an Advent API,’ he told us. Some clients have no way of taking a feed or even a file from us directly anymore; they’ve switched entirely,” says Brodhead, a data market sales veteran. “And as all this is happening, we see other direct deals painstakingly negotiated over months and over individual data fields, where I’ll even wonder ‘When is enough, enough? Let’s do this differently.’”

Mutual Dependency

But even as the calculus shifts and interest is piqued, the true potential of intermediation is still unknown. One reason is straightforward: the bar to achieving benefits is high. “Pure data distribution sources haven’t taken off yet because the assumption was that they would drive overall data costs down, and that hasn’t happened,” says Ivan Matviak, executive vice president and head of North America at SSGX. “There is no big price benefit in a consolidation service on its own, not yet. The value comes in the aggregation, validation, and governance that comes on top of that.”

Building one isn’t an easy lift, either. There is first the matter of working out the licensing details with data suppliers, and building out the necessary infrastructure and team to support the solution. Neither of these comes easily; each implementation is a delicate dance.

“With certain licenses, we can resell from partners—so our clients don’t need to have an IDC or Thomson Reuters relationship to get pricing data, for example—whereas with others, such as with MSCI’s benchmarks, we are just a delivery mechanism,” explains Charles River’s Haigis. “From the vendors’ perspective, this is incremental; for better or worse, we’re a channel to a piece of business they might not be seeing. For a giant like Thomson Reuters [chasing Bloomberg], it might be a nice long-term play: get close enough to the front office with your data that maybe they’ll think about Eikon,” he says, referring to Thomson Reuters’ desktop platform. “But we’ve certainly had those discussions. If there is already another existing relationship, say in the back office, we don’t want to compete with that direct business. Pondering those issues comes with further optimization, and we’re not there yet.”

There are still brighter lines. For example, the energy and investment required to become a legal vendor of record is massive, including interfacing with exchanges and compliance reporting. Indeed, none of the providers interviewed had any intention of trying at this point.

In addition, operating compromises and hybrid frameworks within the largest institutions can also pose a ceiling to the managed data proposition, Glasser says—at least at the very top end.

“Within a firm, some desks may buck the trend and may be efficient in bringing in and analyzing specific data,” the former CDO says. “However, they still need to report their results to the parent to comply with financial, regulatory and performance reporting. So, several firms require the sharing and use of common data across desks, such as common client identifiers and the use of standardized reference data, while allowing the desks to maintain their own derived data—common data transformed using business rules or analytics. This gives the desks freedom to use and leverage their own information, while satisfying firm-wide standards and meeting compliance and reporting requirements.”

Critical Rethink

For now, it may be that the mainstay vendors are safe, and perhaps content to see how the market develops. But Brodhead and Buzzelli both say that the trend should have them on alert, watching out for which flanks they should protect.

Disintermediation of vendors’ direct sales relationships with clients is not as important as reinforcing those between vendor product management and users. But a misalignment there could spell much bigger problems, Buzzelli warns. 

“Vendors and their clients must have strong relationships to foster product and service innovation,” he says. “This all means they may need to critically rethink their approach to client relationships and commercial models that can serve product innovation and fair profitability, while also meeting the client’s efficiency requirements across operational model, cost, access, and use.”

Brodhead’s take is a bit more existential. “For me, it’s a red flag,” he explains. “Beyond our own dataset—strengths like Interactive Data’s muni bond coverage, and commodities under the ICE umbrella—this should have all of us asking ‘What makes us different; if you have six or eight much broader requirements across OMS, performance and attribution, risk and compliance, why go to us and not to SunGard?’ The mainstays can try to adapt to more ‘as-a-service’ models; we can make acquisitions. But can we do that well, when the size of the ships we’re steering—our organizations—means it is much harder to turn on that dime? Very few people can get the data we have, and preferences in this market are cyclical—so I don’t think we become obsolete. Things are going to change, though.”

This could just be the end of the beginning, then—a period of playing nice before the battle for market share. As State Street’s Matviak says, “Our sweet spot is the unique datasets, from private equity, hedge funds, and institutional loan data—niche data, not public markets. SSGX isn’t meant to compete with Bloomberg on vending data.” 

But he can’t help but caveat that someone—or something—will eventually. “Technology evolves, and we’ll see more automation and machine-learning tools and new players able to do what they do,” he says. “That competition is coming.” Or as the Reuters salesman battling Telerate way back when put it: Nothing lasts forever. 

