Data Standards Competition Heats Up
Last week saw a great deal of activity in the data standards modeling field, or at least a lot of publicity from competing models. Open Data Model, a newer organization than the well-established EDM Council, has set out classifications based on asset class distinctions as the basis for its model. That model is built on the ISO 10962 standard and the same organizing principles as Wikipedia, according to Rodger Nixon, Open Data Model's chief executive and founder.
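ISO 10962 defines the Classification of Financial Instruments (CFI) code, a six-character alphabetic code whose first character gives the asset category. As a rough sketch of how such a code might be read in practice (the category labels below are an illustrative subset of the standard, and none of this is Open Data Model's actual schema):

```python
# Illustrative only: a tiny reader for ISO 10962 (CFI) codes.
# The category map is a partial, simplified subset of the standard.
CATEGORIES = {
    "E": "Equities",
    "D": "Debt instruments",
    "C": "Collective investment vehicles",
    "O": "Listed options",
    "F": "Futures",
}

def describe_cfi(cfi: str) -> dict:
    """Split a six-character CFI code into category, group and attribute characters."""
    cfi = cfi.upper()
    if len(cfi) != 6 or not cfi.isalpha():
        raise ValueError("A CFI code is six alphabetic characters")
    return {
        "category": CATEGORIES.get(cfi[0], "Other/unmapped"),
        "group": cfi[1],
        "attributes": cfi[2:],  # meaning of these depends on category and group
    }

print(describe_cfi("ESVUFR"))  # commonly cited example: a voting, registered common share
```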
Open Data Model's membership grew steadily in the latter part of 2011, according to Nixon, who contends that his organization's offering and methods are more effective than those of the EDM Council. "There are standardizations of the messages and transactions, but our field isn't transactions, it's reference data," he says. "It's not a relational model."
Multiple classifications provide a basis for better dimensional models for data analytics, states Nixon. "You can see the instruments underlying the derivatives," he says. "You're trying to see the underlying instruments in derivatives and 'explode' them out [for analysis]. What's your exposure? What are the simplest components, the ones that have an effect?"
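As a rough illustration of that "explode" idea (not Open Data Model's actual data structures; the positions, weights and figures below are invented), decomposing derivative positions into underlying exposures might look like this:

```python
# Hypothetical sketch: break derivative positions into their underlying
# instruments and aggregate notional exposure per underlying.
from collections import defaultdict

# Each position carries a notional and (underlying, weight) pairs describing
# what it decomposes into. All names and numbers here are invented.
positions = [
    {"id": "OPT-1", "notional": 1_000_000, "underlyings": [("ACME_EQUITY", 1.0)]},
    {"id": "SWAP-7", "notional": 5_000_000, "underlyings": [("EUR_6M_RATE", 0.5), ("USD_3M_RATE", 0.5)]},
]

def explode_exposure(positions):
    """Sum notional exposure by underlying instrument."""
    exposure = defaultdict(float)
    for pos in positions:
        for underlying, weight in pos["underlyings"]:
            exposure[underlying] += pos["notional"] * weight
    return dict(exposure)

print(explode_exposure(positions))
# {'ACME_EQUITY': 1000000.0, 'EUR_6M_RATE': 2500000.0, 'USD_3M_RATE': 2500000.0}
```

Grouping exposure by underlying in this way is the kind of question a classification-driven dimensional model is meant to make straightforward.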
Meanwhile, in a webcast by the EDM Council, managing director Mike Atkin emphasized the uses of its Financial Industry Business Ontology (FIBO), which, as he describes it, is not strictly a data model. "It's a formal and factual definition of reality, done by subject matter experts and validated by the industry," says Atkin. FIBO will have three versions: for business entities, for instruments and for loans. Beyond ontology, however, says Atkin, semantic processing will be the future of data management, because it can bridge the gap between simple data dictionaries and ontologies.
The EDM Council worked with the standards body Object Management Group on FIBO, and has done a proof of concept for semantic processing using derivatives and FIBO, says Atkin. The effort is meant to support compliance with regulatory demands by taking in FpML, pulling data from relational databases, analyzing and classifying data, and analyzing links to counterparties, he explains. The council's semantic processing plans show promise for pulling together different data standards; the specifics include implementation in XML format and possibly metadata annotations for derivatives, Atkin adds.
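As a minimal sketch of what such semantic processing could look like (this is not the EDM Council's actual pipeline; the XML snippet, namespaces and class names below are hypothetical stand-ins, not real FpML or FIBO identifiers), one might map a simplified trade record into RDF and query counterparty links:

```python
# Hypothetical sketch: read a simplified, FpML-flavored trade snippet, classify
# it against an illustrative ontology namespace, and query counterparty links.
import xml.etree.ElementTree as ET
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

# Simplified XML for illustration only -- not a valid FpML document.
TRADE_SNIPPET = """
<trade id="SWAP-12345">
  <swap/>
  <party href="PARTY_A"/>
  <party href="PARTY_B"/>
</trade>
"""

EX = Namespace("https://example.org/ontology#")    # stand-in namespace, not real FIBO IRIs
TRADES = Namespace("https://example.org/trades/")
PARTIES = Namespace("https://example.org/parties/")

def trade_to_graph(xml_text: str) -> Graph:
    """Map the simplified trade XML into RDF triples."""
    root = ET.fromstring(xml_text)
    g = Graph()
    trade = TRADES[root.get("id")]
    # Classification step: the presence of a <swap> element drives the ontology class here.
    if root.find("swap") is not None:
        g.add((trade, RDF.type, EX.InterestRateSwap))
    for party in root.findall("party"):
        g.add((trade, EX.hasCounterparty, PARTIES[party.get("href")]))
    return g

graph = trade_to_graph(TRADE_SNIPPET)
# Ask which counterparties each swap is linked to -- the kind of exposure
# question semantic processing is meant to answer across data sources.
results = graph.query("""
    PREFIX ex: <https://example.org/ontology#>
    SELECT ?trade ?party WHERE {
        ?trade a ex:InterestRateSwap ;
               ex:hasCounterparty ?party .
    }
""")
for trade_uri, party_uri in results:
    print(trade_uri, "->", party_uri)
```

The point of the sketch is the bridging step Atkin describes: the same triples could be populated from FpML messages, relational extracts or data dictionaries, and then queried uniformly against the ontology.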
The positioning by Open Data Model and the EDM Council raises the question of which approach is the right way to go for data standards modeling. Is it emphasizing organization or classification by asset class? Or is it linking the proverbial alphabet soup of acronym-named formats together? I'd like to hear your thoughts. Please post replies here, or respond via Inside Reference Data's LinkedIn discussion group, where this column will also appear.