Sponsor's Statement: Be in Control
The common thinking behind implementing a reference data management system is that there are two approaches: build your own data model or buy one from a commercial vendor. The premise for buying a commercial data model is that such a product is a superset of all the fields, attributes, relationships, vendor adaptors, etc., of the client's requirements, making implementation faster than building it yourself. The data model will then be maintained and upgraded by the vendor as new feeds, asset classes, relationships, identifiers and downstream systems need to be supported. All sounds good!
The important thing to understand with this proposition is the market segment the solution is relevant for. For smaller asset management companies and hedge funds with a few feeds and asset classes, the commercial data model might be a good fit and have the required superset of attributes, adaptors, relationships and so on built into its model. However, if the problem is small enough (say a couple of feeds for a couple of asset classes), it may be better addressed by a cheaper home-grown solution than by a mallet used to crack a small nut.
Conversely, at the other end of the market, in global investment banks, custodians and investment managers, the likelihood of the commercial data models having a superset of the data requirements is extremely low. This is particularly true when you consider the impact of globalization and the growth in importance of emerging markets and niche/local data vendors on the Tier 1 organizations. The footprint of these organizations across the world will never be matched by the "superset" theory of the commercial data models.
Another major issue with commercial data models is how these models are managed through the full lifecycle of upgrades and evolutions over 15-20 years. A core element of the commercial enterprise data management (EDM) value proposition is based on having a set of data vendor adaptors maintained and upgraded together with the underlying model (can't have one without the other). This requires the underlying data model to be upgraded to support the upgraded adaptors and vice versa.
It is interesting to contrast the data model upgrade experience with other domains, such as the world of enterprise resource planning (ERP) applications. Reference data in financial services firms is highly dynamic: multiple vendors and data sources, identifiers, cardinalities and conflicting classifications, with constant change driven by data vendors, the business and regulators. Contrast this with the ERP world, where inputs are relatively static and determined mainly by the application vendor, and you can see it is a very different data management problem, and therefore a very different data model maintenance problem.
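To make the dynamism point concrete, here is a purely illustrative Python sketch (the vendor names, field names and values are invented, and this is not PolarLake's code) of the same instrument arriving from three feeds with different field names, identifier styles and conflicting classifications, the kind of variance a static commercial model must keep absorbing through schema change.

```python
# Illustrative only: the same bond delivered by three hypothetical vendor feeds,
# each with its own field names, identifier conventions and classification scheme.

vendor_feeds = {
    "vendor_a": {"isin": "XS0123456789", "asset_class": "Corporate Bond",
                 "sector": "Financials", "coupon": 4.25},
    "vendor_b": {"id_isin": "XS0123456789", "classification": "Fixed Income - Credit",
                 "gics_sector": "Banks", "cpn_rate": "4.250"},
    "vendor_c": {"identifier": {"scheme": "ISIN", "value": "XS0123456789"},
                 "category": "Debt / Senior Unsecured"},
}

def classification_conflicts(feeds):
    """Collect the fields that look like classifications and report the
    distinct, conflicting values seen across vendors for one instrument."""
    classification_keys = ("asset_class", "classification", "category")
    values = set()
    for record in feeds.values():
        for key in classification_keys:
            if key in record:
                values.add(record[key])
    return values if len(values) > 1 else set()

print(classification_conflicts(vendor_feeds))
# e.g. {'Corporate Bond', 'Fixed Income - Credit', 'Debt / Senior Unsecured'}
```

Every new feed, market or regulator-driven attribute adds another variant like these, which is exactly the maintenance burden the rest of this piece is concerned with.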
The maturity and capability of ERP application upgrades aren't just a function of the billions of dollars spent on R&D (that obviously helps) and almost 40 years of experience. The fundamental difference is that in ERP implementations the application vendor largely controls what data is input, stored and managed, making it a vastly simpler data management issue. Even so, the underlying ERP data problem still requires periodic large-scale upgrades to the data model, in major releases that are themselves large-scale projects.
The EDM vendors have for too long tried to make the world believe that a far more complex data problem than ERP's could successfully mimic ERP's approach to large-scale data model upgrades. This is why you rarely hear of a routine upgrade to a security master implementation; instead it turns into a new implementation and a new vendor selection process. And no matter how faithfully a firm "implemented according to best practice and vendor guidelines," the upgrade is not going to be as straightforward as advertised. Think about it: in the ERP world, the application vendor controls the data inputs and upgrades are still notoriously difficult. In reference data management, nobody controls the data inputs (N different data vendors), so upgrades are by definition more complex than the most complex ERP upgrades. So, is the easy upgrading of vendor reference data models an unsolvable problem? Quite possibly.
This is what attracts globalized firms to building and maintaining the data model themselves. Without a virtually automatic upgrade path (which doesn't exist), a commercial data model can be a liability: it can constrain the business and force it to run to someone else's timeline and priorities. Large firms need to own their model and control its destiny, not for competitive advantage as was argued in the past, but to keep control of its evolution for a globalized business. They also understand the difficulty of a model upgrade path and do not want to be tied to monolithic releases promised at some stage in the future. Their businesses are simply far more complex than anything they can buy off the shelf. Also, very few large firms think they can learn much about data modelling the securities business from a third-party software company, and they are probably right.
At PolarLake we believe there is a third way. If you would like to learn how these classic problems can be addressed through genuine technology innovation and the use of semantic technologies, please get in touch. Visit our website at www.polarlake.com.
John Randles is CEO of PolarLake.