Chicago-based futures trading software vendor Trading Technologies (TT) had spent over two decades building out its flagship X_Trader platform. So, when the vendor decided to completely change course and phase out the monolithic, hard-install trading platform in favor of a software-as-a-service offering built on a cloud architecture, it surprised some, including people within the four walls of TT.
By 2012, when the company made this decision to change everything, cloud computing was hardly new—at this point, most in the capital markets had, at a minimum, begun experimenting with private cloud, and some of the larger sell-side firms were kicking the tires on hybrid offerings by partnering with public cloud providers like Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP) and IBM Cloud. The decision to make such a bold move and put the company’s lifeblood on the cloud, though, was—to say the least—unique.
And to be sure, TT had to learn some lessons along the way.
“Being the first mover, in this case, can have a downside because you become so invested and have so much inertia in one way of building something—and a commitment to one tech stack or one platform—that later entrants can do what took you 80 months and 100 people by clicking a button because Amazon automated all of it,” says Rick Lane, the CEO of Trading Technologies.
In those early days, it was also more than just hours lost—sometimes it was dollars and cents. Lane recalls a time when the vendor made a small change to the way it stored tick data in one of its databases. He says it was a relatively innocuous change—“no one really thought much about it”—but the next month TT received its storage bill and noticed it was $50,000 higher than the previous month. It turns out that small change cost the company a lot more than spare change.
“Anytime we’re going to spend $50,000, we scrutinize it a lot,” he says. “This was something that slipped through the cracks, as simple as could be, and it was another reminder that you have to make sure you understand the impact of turning on these incredibly easy-to-consume services and that you don’t forget about them. When you turn them on, you have to make sure they’re tagged and associated in the right way so that when someone comes through and tries to audit everything, they know whose spend belongs to which team, and that sort of thing.”
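In practice, that tagging discipline can be enforced at the moment a resource is created. Below is a minimal, hypothetical sketch using AWS’s boto3 SDK (not TT’s actual tooling; the AMI ID, tag keys and team names are invented) that launches an instance with cost-allocation tags already attached, so the spend can later be traced to a team.

```python
# Hypothetical illustration of cost-allocation tagging on AWS (boto3).
# The AMI ID, tag keys and team names are invented for the example.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Tag the instance at creation time, so its spend is attributable to a
# team before anyone has a chance to forget about it.
response = ec2.run_instances(
    ImageId="ami-0abcdef1234567890",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [
            {"Key": "team", "Value": "market-data"},
            {"Key": "cost-center", "Value": "tick-storage"},
        ],
    }],
)
print("launched:", response["Instances"][0]["InstanceId"])
```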
While tools have since entered the market to help firms more accurately track cloud usage and spending, there is a litany of other obstacles that firms need to manage when moving to a public cloud.
This article will not discuss the benefits of the cloud—that’s a story that has been well documented already and with each passing day, more and more firms are looking at new ways of using the cloud and cloud-based tools. (See Chart 1)
Rather, this story is about some of the pitfalls of using the public cloud. Of course, there are security concerns—this, too, has been well documented—but this is also an aspect that has been a bit overblown. Yes, firms can create massive vulnerabilities if they cut corners when architecting for the cloud, but the big four public cloud providers also have the resources to combat new threats that Wall Street firms simply do not have.
Beyond security—which will still be covered—early adopters of the public cloud have also had to learn lessons around migration strategy and architecture, cost management, how to properly deploy open-source tools and, crucially, how to change the culture of the firm while trying to attract new talent with unique skillsets. Taken together, these considerations will help inform which public cloud provider(s) a firm should use. (See Chart 2 and the box at the bottom of the story.)
Trading Technologies is far from alone in moving to the cloud—in fact, it represents one of the early adopters in this great migration. Here are some of the lessons learned along the way by banks, buy-side firms and vendors alike.
The Process
The Blackstone Group is one of the largest private equity firms in the world. As such, it is in a unique position as both a company that builds its own technology solutions for internal use, and one that invests in cutting-edge companies—more than 95 of them, with $82 billion in combined annual revenue, according to the firm.
In the summer of 2018, Blackstone took chief information officers (CIOs) from its portfolio companies out to Silicon Valley to conduct back-to-back meetings with Amazon, Microsoft and Google. Bill Murphy, Blackstone’s CTO, says that about two years earlier, the CIOs’ general attitude toward the public cloud was that it was an interesting proposition, but not a near-term inevitability for them.
This time, he says, the attitudes had shifted and moving some parts of their operations to the public cloud was a short-term priority for almost all of the CIOs who went on the trip.
According to a recent report produced by consultancy Tabb Group, the majority of firms across the buy side, the sell side, and exchanges and trading venues plan to increase spending on cloud in 2019. At the annual Waters USA event, held in Manhattan in early December 2018, the more than 200 delegates in the audience were asked how much of their workloads are in the cloud versus on-premises: 54% said between 0% and 25% of workloads were in the cloud, 23% said between 25% and 50%, 12% said between 50% and 75%, and 12% said between 75% and 100%.
Murphy says that Blackstone specifically falls into that 25% to 50% bucket. The notion that you flip a switch and the cloud magically appears is a fallacy, he says. He doesn’t view the cloud itself as the destination; rather, the destination is the cloud coupled with good governance—a point that often gets overlooked.
“You can do the cloud wrong; it’s not like you just suddenly press one button and you’re in the cloud. You can misconfigure the cloud instances and create massive vulnerabilities, so you need to have proper governance when you move there and be worried about security, just as much as if it was in your own private datacenter. Hopefully it will be a little easier because some of the tooling is built in for you,” Murphy says. “It’s just a workshop and you can use tools wrong. You can use a saw to make a beautiful piece of furniture—or you could cut your arm off. You have to be careful with how you’re doing it.”
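One concrete instance of the misconfiguration Murphy warns about is a publicly exposed storage bucket. The snippet below is a minimal, hypothetical audit, assuming AWS and the boto3 SDK rather than anything Blackstone runs, that flags buckets lacking a full public-access block.

```python
# Hypothetical audit for one common cloud misconfiguration: storage
# buckets without a full public-access block. Assumes AWS and boto3;
# this is an illustrative sketch, not Blackstone's tooling.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        cfg = s3.get_public_access_block(Bucket=name)
        blocks = cfg["PublicAccessBlockConfiguration"]
        exposed = not all(blocks.values())  # any of the four flags off
    except ClientError:
        exposed = True  # no public-access block configured at all
    if exposed:
        print("review bucket:", name)
```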
What further complicates matters is the sheer scale of updates released on a yearly basis by the big public cloud providers. Blackstone uses AWS, Azure and GCP for various aspects of its organization, and Murphy doesn’t have anything bad to say about any of them—they each have their strengths and weaknesses. But when using a multi-cloud model, it’s important to stay abreast of the newest updates, which poses unique challenges.
“It’s never been harder to stay abreast of everything that’s happening as it relates to the technology world. For the CIO or CTO, I think the job is more complicated than it’s ever been because AWS, Google, and Microsoft are each releasing probably 1,000 new features each year. Just reading the details of 1,000 new features every year is a challenge. I don’t know the great answer [to that question],” Murphy says.
It is for this reason that Blackstone has invested in Cloudreach, a cloud consulting firm it hopes can help answer some of these questions. “There’s a dearth of that expertise in the market, so staying up on everything that they’re doing innovation-wise is a challenge and you’re probably going to leave some innovations on the floor that you could have picked up just because of awareness. So if anyone has great awareness strategies, we’re all ears because we want to make sure that we’re taking advantage of the latest [breakthroughs],” he says.
Compare and Contrast
While New York-based derivatives software vendor Numerix has established itself in the risk analytics space, it has also slowly been expanding its field of expertise over the past few years thanks, in part, to embracing the cloud. The firm had been experimenting with the cloud for several years—using Azure for some internal workloads—but about two years ago it started to use the public cloud for production deployments and for its managed services. For this latest endeavor, it switched over to AWS, though it still supports Azure for specific customer requests.
As mentioned before, expense is always a concern. Benjamin Meyvin, senior vice president of managed services for Numerix, says that, by and large, the prices of Azure and AWS are very close. Beyond cost, there are many other aspects that led Numerix to select AWS for its managed services business.
At the top of the list were disaster recovery and compliance. The vendor, for example, has a significant presence in Singapore. When Numerix was making its decision, Azure had only one datacenter in the Southeast Asian city-state—AWS had three.
Also, Meyvin says he felt that from an operational perspective, AWS was stronger. With AWS, he says, if you have a question, the online documentation “is fool-proof; it’s correct. If you follow the instructions, you get the desired outcome.” He says that with Azure, often its documentation would be “contradictory” and he would have to go to online forums and blogs to piece together the information he needed.
Additionally, there’s the concept of dedicated instances: For example, if Numerix is running workloads on a particular virtual machine, no other company or user is allowed to use the underlying server at the same time—something a client or local jurisdiction might require. Meyvin says that with AWS, all virtual machine types offer this option at an added price; with Azure, it’s not fully supported, and where it is, the cost can be prohibitive.
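For the curious, requesting a dedicated instance is a one-line option at launch time. The following sketch uses AWS’s boto3 SDK; the AMI identifier is a placeholder, and this is an illustration of the general mechanism rather than Numerix’s configuration.

```python
# Illustrative sketch of requesting single-tenant hardware on AWS via
# boto3. The AMI ID is a placeholder; this shows the general mechanism,
# not Numerix's actual configuration.
import boto3

ec2 = boto3.client("ec2")

ec2.run_instances(
    ImageId="ami-0abcdef1234567890",  # placeholder AMI
    InstanceType="m5.large",
    MinCount=1,
    MaxCount=1,
    # "dedicated" keeps other customers off the underlying server,
    # at a higher hourly price than shared (default) tenancy.
    Placement={"Tenancy": "dedicated"},
)
```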
But where Azure shines is when it comes to contract flexibility. For most companies—i.e., companies the size of Numerix and not National Security Agency government contractors—AWS takes no liability and essentially offers a take-it-or-leave-it contract, Meyvin says. Microsoft, on the other hand, will sit down and craft an agreement that works for both parties and offers things like license mobility and the ability to bring a firm’s own licenses over to the cloud.
“At the end of the day, it becomes this balancing act of commercial considerations, legal considerations and technical capabilities—and I guess that’s why both companies are so successful,” he says.
Truth in Numbers
Perhaps the top challenges when converting to a public cloud model are cultural and philosophical. Meyvin notes that today there are good tools to measure cloud usage and spend. Where the surprises come, though, is in instances of treating a cloud architecture like a legacy IT architecture.
When the business and technical communities start talking about the number of cores they will need or the size of memory, red flags can pop up because these metrics tend not to be sufficient to produce any kind of sensible estimate of future costs.
“When you reach out to Microsoft or AWS and ask for any kind of estimate, if you call up and say that you need 500 cores and ask how much that will cost, they directly answer the question—they will tell you, truthfully, how much 500 cores are, but this is like going into Best Buy and buying a motherboard and nothing else. So 500 cores, $500—but then you need everything else and that’s another $5,000,” Meyvin says.
Meyvin is personally responsible for cloud spend in his group. The way he has tried to control cost is through automation and proper governance, but it took some trial and error to figure out the best model.
Originally, Numerix gave users the ability to request cloud resources—such as servers and tools—and it was then up to each user to release those resources once they were done, which rarely happened. After the self-service model failed, Numerix created a schedule under which cloud resources were available for eight hours per day, four days per week. That didn’t work either: what was good timing for New York was inconvenient for Singapore, what worked for Singapore didn’t work for London, and so on.
Numerix has since settled on a model where authorized users—it’s not a free-for-all—can request resources and essentially get a two-hour “lease” on that server or set of servers. The way it works is the users request cloud resources via an automated email system. If the user forgets to shut down the environment, that set of resources will automatically shut itself down after the two-hour lease expires. Or, the authorized user can send a follow-up email requesting another two-hour block, which will automatically reset the timer.
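Numerix’s email-driven implementation is not public, but the lease pattern itself is simple to sketch. The hypothetical Python example below, assuming AWS and boto3 with an invented lease-expires tag, stamps an expiry time on an instance when a lease is granted and stops any running instance whose lease has lapsed; a scheduled job would call the reaper periodically.

```python
# Hypothetical sketch of the two-hour "lease" pattern, assuming AWS and
# boto3. The lease-expires tag and mechanics are invented; Numerix's
# actual email-driven system is not public.
from datetime import datetime, timedelta, timezone

import boto3

ec2 = boto3.client("ec2")
LEASE_TAG = "lease-expires"  # invented tag name

def grant_lease(instance_id: str, hours: int = 2) -> None:
    """Start (or extend) a lease by stamping an expiry time on the instance."""
    expires = datetime.now(timezone.utc) + timedelta(hours=hours)
    ec2.create_tags(
        Resources=[instance_id],
        Tags=[{"Key": LEASE_TAG, "Value": expires.isoformat()}],
    )
    ec2.start_instances(InstanceIds=[instance_id])

def reap_expired() -> None:
    """Stop running instances whose lease has lapsed; run on a schedule."""
    now = datetime.now(timezone.utc)
    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[
            {"Name": "tag-key", "Values": [LEASE_TAG]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in inst.get("Tags", [])}
                if datetime.fromisoformat(tags[LEASE_TAG]) < now:
                    ec2.stop_instances(InstanceIds=[inst["InstanceId"]])
```

Extending a lease is then just another call to grant_lease, which resets the timer, mirroring the follow-up email in Numerix’s system.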
“That’s when I finally got myself out of the business of having to argue with my colleagues over how much they can have; it’s all self-service and audited—I get a daily report on who is using what and for how long in terms of hours and dollars. I don’t need to do anything at all,” Meyvin says. “This is crucial: There is an organizational cultural shift when it comes to cloud technologies, because at first you go through this space where people don’t know much about the cloud. Then, all of a sudden, everyone in the organization thinks that they’re a cloud expert because they’ve had many meetings and have seen many PowerPoint presentations. Once the gates are open, the number of requests for cloud resources that you begin to receive is staggering. All of a sudden, all of those constraints that existed somehow they don’t exist any longer. The more people taste it, the more they like it.”
At Scale
One of the more interesting cloud programs being undertaken in the financial markets is unfolding at Bank of America (BofA). Via a program originally called Project Greenfield—now Bank of America Cloud—the financial services behemoth aimed to host 80% of its global compute workload in the cloud by the end of 2019, and to achieve a 20:1 compression ratio in hosting. The project, which began in earnest two years ago—though the seeds were planted in 2013—looks to significantly lessen the bank’s footprint of more than 60,000 physical machines in 36 datacenters.
To achieve this, it would have to move at least 3,000 operating systems per month to the new environment. That environment differs from the traditional public cloud model described above in that it is technically a private cloud, though given the institution’s size and scale, it mimics the benefits of a public cloud. The bank does also use Azure to run a few applications in “a controlled public cloud environment where it makes sense due to business requests,” says BofA’s CTO, Howard Boville.
With a workforce of 200 employees, the bank has seen months in which 8,000 operating systems were migrated to the cloud. And with over two-thirds of earmarked systems already migrated, between 12,000 and 14,000 systems are in a planning or execution phase at any given moment—an extraordinarily large undertaking.
As might be expected, Boville points to the human-capital element of the project as key to making this work. He advocates creating a governing body—appropriate even for small institutions—that includes “representation across all areas of the enterprise, while remaining customer-focused and managing risk appropriately.”
Additionally, he says it is necessary to create an education program, since this is a significant paradigm shift. “Cloud migration and implementation is a cultural change requiring full training and communication planning to support the migration strategy,” Boville says.
And while it might seem a departure from the conversation about culture, it is important to remember that not everything can be appropriately lifted-and-shifted to the cloud—and that, too, comes down to education and culture. BofA’s 80% target for 2019 wasn’t arbitrary: the other 20% was deemed inappropriate for the cloud, at least at this time. Numerix’s Meyvin echoes this point. “The very first impulse for any company that has an existing technology stack is lift-and-shift. That’s where cost can raise its ugly head,” he says. “There are applications that are simply not cloud-friendly; while they can be deployed in the cloud, they were not designed to work effectively and efficiently in the cloud.”
Blackstone’s Murphy notes that every company will likely have to do some combination of lift-and-shift (while it’s not always ideal, “legacy reasons” make it necessary) and something more targeted—sometimes hybrid private–public, sometimes a blend of multiple public clouds, in addition to internal private projects. Blackstone itself prefers to take a more “thoughtful” approach to take advantage of new capabilities (which lift-and-shift makes challenging), even if such migration projects tend to take longer.
Murphy says moving to the cloud is “all about creating velocity in our technical deployment.” But to bring it back to culture, it is vital to have people in place who have done this before, and sometimes that means bringing in outside experts to help move the process along; otherwise, velocity will be hindered.
Blackstone has looked both to hire developers with cloud expertise and, as Murphy mentioned, to invest in cloud specialist Cloudreach to help it move from the left side of that development bell curve to the full-production right side. Murphy compares it to taking a basketball team that is well-oiled and has been playing together for years, and telling the players they’re going to have to play rugby—sure, they’re athletic and will learn what to do, but having some specialists to help with the transition makes things easier.
Aaron Painter, CEO of Cloudreach, which launched in 2009, says there are different skillsets needed to be able to architect for different cloud environments. As a result, if a company is looking to deploy a multi- or hybrid-cloud strategy, having players with specific skillsets will make the process easier.
“One of the biggest, unforeseen challenges that people don’t take into account is the culture change of their organization. People often think of this as a business change or a technology change, but it’s more about the cultural transformation that they’re trying to drive in their organization,” Painter says. “Things change so fast that even the smartest, most well-intentioned person has a hard time keeping up because there’s so much [that is] new every week.”
And while doing everything from scratch and learning from trial and error is something of a Wall Street axiom, when it comes to the public cloud, many others have already gone down this path and have developed expertise after taking their lumps in real-world situations, Murphy says.
“If I could do it all over again, I’d go faster bringing in experts to help my team get up the learning curve,” Murphy advises. “I think we’re getting there—and certainly we’re in a good spot now—but if we had brought in the experts sooner we probably could’ve made more progress. The number one thing is don’t assume you can do it yourself—because you probably can, but it’s just going to take you way longer to learn it yourself than if somebody is teaching you who’s been there before.”
Navigating the Matrix
Northern Trust began its cloud experimentation in 2008, building private environments at a time when financial services firms were still leery of the cloud, to say the least, and the idea of public clouds for the capital markets was still somewhat anathema.
Joerg Guenther, chief information officer for Northern Trust Corporate & Institutional Services, says that back then, public cloud options weren’t ready, but today they can handle the production loads and instances, while providing the robust security required for a financial services firm’s business-critical applications.
For its private-equity blockchain solution, Northern Trust originally partnered with IBM Cloud, but then switched over to Azure. On the front-office solutions side, it is using the AWS ecosystem to develop solutions that will ultimately be integrated into Matrix, the bank’s new global asset servicing platform. Matrix represents Northern Trust’s first product running exclusively in the public cloud, Guenther says.
Matrix will take advantage of containerization techniques—such as Kubernetes and Docker—that are becoming increasingly popular in conjunction with the cloud. For the project, Northern Trust has also adopted Pivotal Cloud Foundry, a cloud-native platform-as-a-service offering. The architecture lends itself to being distributed and deployed to a cloud environment, Guenther says. It is vital to have a strict process in place for deciding what should and shouldn’t be moved to the cloud—what can be lifted-and-shifted, and what will require hands-on architecting.
“It’s really important, for every application you want to migrate, that you perform what I would call an ‘application disposition’—you actually understand if this application is cloud-enabled, if it is suitable for the cloud, or if it is something that should not be moved to the cloud,” says Guenther. “You need to understand what is a good application for cloud deployment, and what is not. If you make mistakes in that early stage, you end up either reinforcing applications, or you just pay the price by having to redesign it again and again.”
While oversight and culture might seem straightforward, they are anything but. “A lot of people underestimate how they demonstrate governance, control and oversight in a public cloud framework,” he says, adding that it takes a different way of thinking about connections and architecture. “It depends on your use-case: What data is there and what business is it actually touching?”
The Beginning
In Italian families, with variations told in other cultures, there’s a well-known tale about the importance of the details of a project. The story goes something like this: Enzo was a renowned architect and built the finest homes in Calabria (or whichever province the teller’s Italian grandmother is from). His mitering was perfection, the nails invisible. Everything fit like a glove.
After 30 years of building homes for a private contractor, he was ready to retire. His boss pleaded with him to stay on, and Enzo agreed to build one more house. Considering that he had one foot out the door, he decided to cut some corners. The house was fine, but hardly lived up to his standards. When it was completed, the contractor thanked Enzo for his service, and gave him the house, fully paid for. The moral? Don’t cut corners.
While not Italian, Blackstone’s Murphy appreciates the gist of the story, but offers a different take. Some might like to think that cloud is easy—plug it in and the cost savings will flow like the River Po. That is not the reality. The key is to pay close attention to detail.
“Everybody wants to believe it’s cheap and easy, so the first thing I would do is refute that, and say you can do some proofs-of-concepts to learn, but don’t let the proof-of-concept become the end-state—that’s the biggest risk that I see, because people can get enamored of the quick win. So do some proofs-of-concept to take a first step, to learn, to get your staff familiar, and then go back and say, ‘How am I going to get there with my broader environment? How am I going to govern and really have a strategic approach?’” Murphy says. “And if you can afford to, go the slow, methodical way. You could put a house up in a weekend if you really needed to, but it would probably be bad quality. If you take your time and do it right and you wire for the future and put in the proper insulation, you could probably live there forever.”
BOX: Couples Therapy
While the topic can get a bit hazy, not only are firms increasingly embracing public cloud providers, but they’re also starting to explore working with multiple cloud providers at the same time. As Blackstone CTO Bill Murphy notes above, the private equity behemoth uses AWS, Azure, and GCP for different functions of the business. Many other larger firms are kicking the tires on a multi-cloud strategy, including the likes of Axa, Credit Suisse, New York Life Insurance Company and TD Bank.
But on the whole, the industry is more geared toward using a single public cloud provider, though firms are likely working with vendors that use a cloud-delivery/as-a-service model, and may also have their own internal private clouds. According to Monica Summerville, head of fintech research at Tabb Group, who wrote the research report mentioned earlier in this article, only 31% of firms currently have multi-cloud strategies for public cloud.
“That is not to say the firm won’t have employees making use of different clouds, but from a corporate perspective, multi-cloud needs to be managed on an enterprise level, and that is not the case at the majority of firms at present,” she says, adding that, for those who are coupling providers, AWS and Azure are the most likely to get paired. Google is third, and making strides in London. IBM is finding success in the regtech and blockchain spaces and recently announced that IBM Watson services, including Watson Assistant and Watson OpenScale, are now available on AWS, Azure, GCP, and any other cloud.