This week, I’m going to spend a fair amount of time talking about one story, but only because I think that its premise bleeds into the same underlying concept (and challenge) that faces other large, industry-wide projects. Also, there are interesting things happening in the virtual-desktop space, and there seems to be a lot more crossing-the-street when it comes to tech in the retail and capital market spaces. Let’s get to it.
Isda’s CDM Hits a Bum Note with Banks
Back in 2017, the International Swaps and Derivatives Association (Isda) announced that it would work on defining processes and procedures in trading to a standard, machine-readable format, known as the common domain model, or CDM. At the time, Isda CEO Scott O’Malia had this to say: “The system as it stands is creaky, over-complicated, and outdated, [which increases] cost and compliance burdens for all market participants. New technologies can alleviate many of these problems, but first we need a reform of current standards and practices.”
The idea makes sense: post-trade’s plumbing is everything that Isda says it is, and the CDM could transform data reporting across derivatives trading. Yet trying to solve one old problem inevitably runs into two others: how much will it cost, and who will pay for it? While Isda estimates that the CDM could save broker-dealers $3 billion a year in reconciliation costs, the bill for upgrading banks’ infrastructure could run into nine figures.
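To make the reconciliation problem concrete, here is a minimal sketch of the kind of mismatch a common domain model is meant to eliminate. The field names and the `CommonTrade` class are hypothetical illustrations, not Isda’s actual CDM schema.

```python
# Illustrative sketch only: field names and CommonTrade are hypothetical,
# not the actual Isda CDM schema.
from dataclasses import dataclass

# Two firms book the same trade with different field names, types,
# and date conventions -- the classic reconciliation problem.
bank_a_trade = {"notional": "10000000", "ccy": "USD", "effective": "2020-07-15"}
bank_b_trade = {"Notional_Amt": 10_000_000.0, "Currency": "usd", "EffectiveDate": "15/07/2020"}

@dataclass(frozen=True)
class CommonTrade:
    """A shared canonical representation, in the spirit of a common domain model."""
    notional: float
    currency: str
    effective_date: str  # ISO 8601

def from_bank_a(t: dict) -> CommonTrade:
    return CommonTrade(float(t["notional"]), t["ccy"].upper(), t["effective"])

def from_bank_b(t: dict) -> CommonTrade:
    day, month, year = t["EffectiveDate"].split("/")
    return CommonTrade(float(t["Notional_Amt"]), t["Currency"].upper(),
                       f"{year}-{month}-{day}")

# Once both sides map to the same model, reconciliation collapses to an
# equality check rather than a bespoke field-by-field matching exercise.
assert from_bank_a(bank_a_trade) == from_bank_b(bank_b_trade)
```

The point of the sketch is the last line: with a shared model, each firm writes one translation layer, instead of every pair of counterparties maintaining bilateral matching logic.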
This week, Barclays released a new paper that addresses this issue, and puts the onus on exchanges and financial market infrastructures (FMIs), such as clearinghouses. The bank’s director of research and engineering, Lee Braine, tells WatersTechnology that some banks are having trouble making the case internally for a project that promises savings on post-trade processes but generates no revenue. If an FMI leads the charge on driving adoption by offering new products or services that can only be accessed through CDM-compatible interfaces, that could help convince banks, he suggests.
“The business-case challenge is not to persuade all the broker-dealers to migrate off existing functioning internal systems. It is merely to add new services, and that’s an easier sell,” says Braine, who, for my money, might be the best person to talk to in the capital markets on the subject of “future technology”.
Some reticence on the part of the banks is, perhaps, understandable. After all, the history of capital-markets technology is replete with projects that promise the world in terms of efficiency, but require up-front investment to make them a reality. Few end up achieving those goals.
For the CDM, a critical mass of banks is needed, and as our colleagues at Risk.net reported last year, the project is seeing “patchy” participation from the sell side. “On the Isda calls, I would expect a healthy number of nine to 10 banks engaged, but there are less than half that. It doesn’t work with just one or two banks—it needs critical mass,” one source told Risk. Isda disputed that tally, but it also didn’t say how many banks were on the calls.
The Barclays paper lays out eight options that the industry could take to tackle inconsistent data and processes, as well as duplicated data, in the post-trade lifecycle. It’s definitely worth giving this article a read, as it also features a who’s who of the heavy hitters in this area. But, again, the key is going to be bank buy-in. From the story: A senior executive at another investment bank talks of the “vast amount of money” it would take to reconfigure post-trade plumbing and argues savings might not be felt for decades. He says: “The cost of moving off your existing expensive legacy to this new stuff, the business case is like 20 years’ payback … you can’t quite justify the economics.”
CAT & CHESS & Big Ideas
I also want to hit on two things that have nothing to do with that CDM story, but also track with the notion that the best-laid plans of market-structure experts often go awry.
Naturally, the Consolidated Audit Trail—aka, the CAT—comes to mind. In 2012, following the 2010 Flash Crash, the US Securities and Exchange Commission (SEC) approved Rule 613, mandating that the National Market System exchanges begin work on a comprehensive audit trail of market activity. A full decade after the Flash Crash, the CAT is now live, after a litany of delays and issues in building the actual platform. In private, when you talk to ops and compliance folks at the broker-dealers, they’re frustrated, to say the very least, with the way it has been carried out.
With this thing finally starting to creak forward, we’re still going to have to wait some time to see if the output was worth the effort.
I also can’t help but think about the Australian Securities Exchange’s (ASX) distributed-ledger technology (DLT) replacement of its Clearing House Electronic Subregister System (CHESS). (That’s a lot of acronyms in one sentence … sorry.) If successful, it would be “a major breakthrough of that technology in one of the major asset classes,” in this case equities, as Axel Pierron of consultancy Opimas said back in 2017. At the end of June, the exchange operator set a new go-live date of April 2022, a 12-month delay from the original target. Obviously and fairly, the pandemic was cited as a reason for the delay. But it’s also true that when the idea of a DLT replacement was first unveiled in 2016, market participants complained that implementing a shared-ledger environment wouldn’t address their current needs—“it’s a technology shift that doesn’t bring any upgrade at this point,” Pierron told WatersTechnology.
When asked directly about participant concerns about the need for a DLT replacement—again, in 2017—Cliff Richards, general manager for equity post-trade services at ASX, said he understood these concerns, but there was appetite for the project as exchange clients were also examining the potential of the technology.
It’s that last point that is interesting. Yes, at the height of blockchain mania in 2017, financial institutions of all stripes were looking at DLT, either individually or as part of a consortium. But since then, some firms are having second thoughts about the value of DLT-based platform replacements.
You have to understand something about me: I care most about the people who work in tech and data, so I often won’t look to defend these sweeping industry projects because at the end of the day, it’s tech and ops who most often take it on the chin.
“Hey, we’ve got this wonderful initiative that will save your bank a lot of money down the line, but it’s going to cost some upfront investment and hundreds of working-group calls.”
“Oh, cool, just what I need because I don’t have much going on, except for these 12 other reg reporting projects, we’re trying to switch out our OMS, our CEO wants us to explore blockchain technology, the PMs want me to integrate a half-dozen alternative datasets, everybody is bitching that this new vendor analytics platform that we bought doesn’t integrate with our EMS, and there’s an effing pandemic going on and everyone is working remotely … you know, except for me, because I have to come in and make sure the goddamn lights stay on. So please, do tell me about this wonderful new idea you have for the exciting world of post-trade technology.”
The CDM, the CAT, CHESS, and any number of regulatory initiatives will ultimately make the markets stronger, more efficient, and safer—if successful. From my seat, it feels like the industry often hitches its wagon to an idea, and then figures out what comes next as it goes. While that works well for exciting internal “innovation” projects that generate alpha for an individual firm, industry projects too often—it would seem to me—get stuck in the mud because there’s limited enthusiasm among the bank tech and ops rank and file, who are the ones ultimately charged with building the plumbing.
I’ve also never worked in tech and ops—I just drink with them—so feel free to tell me if I’m way off base: anthony.malakian@infopro-digital.com.
Tradition Goes Big on Virtual Desktops
Moving on, as far as the day-to-day is concerned, interdealer broker Compagnie Financière Tradition (CFT) is considering moving to a fully virtual-desktop environment, after making a big investment in remote-working technology during the coronavirus pandemic. After shipping hardware to about 600 employees during the months of March and April, Yann L’Huillier, CFT group chief information officer, had this to say to Jo Wright: “I reviewed the costs and found we spent the equivalent of two-and-a-half years of work in a month. Everyone was on it, including me, building laptops and PCs and anything else we needed to work remotely. It was intense.”
One other interesting thing from that story is the fact that IT staff still had to be in the office due to those unforeseen problems. “We had issues one evening when we lost power on a number of PCs and we thought we had a network outage or something. It happened that it was the cleaning staff who knocked down a power supply,” L’Huillier said.
That shit does actually happen—that’s crazy. (Imagine the fear that must sweep across a cleaner’s face after they disconnect a power strip and see a bunch of monitors shut off.)
Also, last month, Mike Dargan, head of group technology at UBS, told Jo about how the firm migrated staff to a VDI setup some years ago, and Dargan says it gave the investment bank an edge during lockdown. That’s also a really informative and entertaining read.
Lines Are Blurring
Our veteran market data reporter Max Bowie gave his thoughts on how the industry is warming up to on-demand data as new technologies come to the fore.
Max also wrote about how alt data provider Apteo has rolled out a new platform called Predictive Insights, which provides key indicators of a company’s financial performance. Apteo anticipates the platform will help it broaden its client base and expand into sectors outside financial services.
From the story: Apteo has built a foundation of core functionality that can be used regardless of the data type being interrogated or the industry that the client works in, and can be tailored to the needs of financial professionals or people working in other sectors.
This particular part of the story jumped out for me because it loosely ties into two other stories we published this week. The first one was about how Finos, the open-source nonprofit, is looking to expand into the retail banking space; the second was about how alt data provider TruFactor, which first specialized in urban planning and ad targeting for telco companies, is now in talks with buy-side firms as it looks to expand into the capital markets.
I was thinking that as more banks and asset managers turn to the major public cloud providers—thanks in part to the industry’s embrace of open-source tools and APIs—and as the alternative data space continues to explode, we’re going to see tech and data providers look to expand their sectors of coverage and not be so industry-specific.
Max says that he believes we’re seeing the emergence of generic “business intelligence” solutions that anyone from any background can use for any function, and integrate any dataset. Max has a lot more experience than me, so I’ll let him have the final word. See you next Sunday.