Case study
BlackRock tests ‘quantum cognition’ AI for high-yield bond picks
The proof of concept uses the Qognitive machine learning model to find liquid substitutes for hard-to-trade securities.
Asset manager Saratoga uses AI to accelerate Ridgeline rollout
The tech provider’s AI assistant helps clients summarize research and client interactions, generate reports, and interact with the Ridgeline platform.
BNY uses proprietary data store to connect disparate applications
The internally built ODS is the “bedrock” upon which BNY plans to become more than just a custodian bank.
Deutsche Bank experiments with regulatory GenAI tool
Project Aggie can complete in under five minutes tasks that typically take business domain experts a few hours, the bank says.
Fidelity’s quantum exploration unites theory and proof
The asset manager and Amazon have teamed to put a quantum twist on machine learning.
Northern Trust adds fixed-income capabilities for outsourced trading in Asia-Pacific
The custodian bank now offers 24/6 fixed-income trading coverage with desks in Chicago, London, and Sydney.
Kepler Cheuvreux builds proprietary execution platform with Adaptive
The broker wants to move away from third-party technologies as DORA’s risk management requirements could make vendor relationships more cumbersome.
BMO’s cloud migration strategy eases AI adoption
The Canadian bank is embracing a more digital future as its cloud strategy makes gains and it looks to both traditional machine learning and generative AI for further augmentation.
How Ally found the key to GenAI at the bottom of a teacup
Risk-and-tech chemistry—plus Microsoft’s flexibility—has seen the US lender leap from experiments to execution.
Chris Edmonds takes the reins at ICE Fixed Income and Data Services
Edmonds now leads ICE’s fixed income and data business as the rush to provide better fixed-income data and analytics builds.
Man Group CTO eyes ‘significant impact’ for genAI across the fund
Man Group’s Gary Collier discussed the potential merits and use cases of generative AI across the business at a Bloomberg-hosted event in London.
The bank quant who wants to stop genAI hallucinating
Former Wells Fargo model risk chief Agus Sudjianto thinks he has found a way to validate large language models.
Man Group’s proprietary data platform is a timesaver for quants
The investment firm’s head of data delves into its alt data strategy and use of AI tools to boost quant efficiency.
Vanguard cautiously explores neural networks for alt data analysis
John Ameriks, head of Vanguard’s Quantitative Equity Group, explains the rationale behind dataset selection and how the group has been using machine learning.
Citi details API for HKEX’s Synapse
New technologies like Synapse help Citi migrate clients to newer systems and to more efficient ways of settling and clearing.
Alliances and experiments: Trading firms get innovative in 2023
Rebecca recaps the year’s most notable technology use cases led by sell-side and buy-side institutions.
How Liontrust AM reimagined tech vendor partnerships to retain IP post-Majedie acquisition
Buying off the shelf can be cheaper and faster than building in-house, but giving up IP rights to critical platforms is a trade-off some firms aren’t willing to make.
JP Morgan fast-tracks market data licensing tools
The bank is developing solutions to help internal teams understand compliant usage and entitlements.
As asset managers look to Asia for alpha, analytics & visualization tools take center stage
PineBridge Investments uses Macrobond Financial, an economic and time-series data analytics provider, to strengthen the economics research that supports its portfolio managers and the firm’s overall investment thesis.
JP Morgan AM develops AI quant tool that uses mind maps to build thematic funds
ThemeBot uses textual relevance and revenue attribution to construct a list of stocks, which is then verified by JPMAM’s active equity analysts.
Goldman taps alt data for economic forecasts during pandemic
Economists at the bank leveraged a combination of public and third-party data to draw conclusions about the future during uncertain times.