Data Standards Competition Heats Up
Last week saw a great deal of activity in the data standards field, or at least a lot of publicity from competing models. Open Data Model, a newer organization than the well-established EDM Council, has set out classification by asset class as the basis for its model. The model is built on the ISO 10962 instrument classification standard and on the same organizing principles as Wikipedia, according to Rodger Nixon, chief executive and founder of Open Data Model.
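To make the asset-class idea concrete, here is a minimal sketch of how an ISO 10962 CFI code encodes classification. The category letters shown are only a small illustrative subset of the standard, and the helper function and example are hypothetical, not part of Open Data Model's actual offering.

```python
# Minimal sketch: reading an ISO 10962 CFI code.
# A CFI code is six letters: character 1 is the category (asset class),
# character 2 is the group within that category, characters 3-6 are attributes.
# The mapping below covers only a few illustrative categories.

CFI_CATEGORIES = {
    "E": "Equities",
    "D": "Debt instruments",
    "O": "Listed options",
    "F": "Futures",
}

def classify(cfi_code: str) -> dict:
    """Split a six-character CFI code into its classification parts."""
    code = cfi_code.strip().upper()
    if len(code) != 6 or not code.isalpha():
        raise ValueError(f"not a valid CFI code: {cfi_code!r}")
    return {
        "category": CFI_CATEGORIES.get(code[0], "Other/unmapped"),
        "group": code[1],
        "attributes": code[2:],
    }

# 'ESVUFR' is the commonly cited example for an ordinary voting share.
print(classify("ESVUFR"))
# {'category': 'Equities', 'group': 'S', 'attributes': 'VUFR'}
```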
Open Data Model's membership grew steadily in the latter part of 2011, according to Nixon, who contends that his organization's offering and methods are more effective than those of the EDM Council. "There are standardizations of the messages and transactions, but our field isn't transactions, it's reference data," he says. "It's not a relational model."
Multiple classifications provide a basis for better dimensional models for data analytics, states Nixon. "You can see the instruments underlying the derivatives," he says. "You're trying to see the underlying instruments in derivatives and 'explode' them out [for analysis]. What's your exposure? What are the simplest components, the ones that have an effect?"
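As a rough illustration of "exploding" a derivative into its underlying instruments for exposure analysis, here is a minimal sketch. The instruments, weights and portfolio are hypothetical and are not drawn from Open Data Model's schema; the point is only that classification down to underliers lets exposure be aggregated at the level of the simplest components.

```python
from collections import defaultdict

# Hypothetical reference data: each derivative maps to its underlying
# instruments with a weight (e.g. a basket option on two stocks).
UNDERLIERS = {
    "BASKET_OPT_1": [("AAPL_EQUITY", 0.6), ("MSFT_EQUITY", 0.4)],
    "INDEX_FUT_1": [("SP500_INDEX", 1.0)],
}

def explode_exposure(positions: dict) -> dict:
    """Decompose derivative positions into exposure per underlying instrument."""
    exposure = defaultdict(float)
    for instrument, notional in positions.items():
        # Instruments with no mapping are treated as their own underlier.
        for underlier, weight in UNDERLIERS.get(instrument, [(instrument, 1.0)]):
            exposure[underlier] += notional * weight
    return dict(exposure)

# Example portfolio: what is the exposure to the simplest components?
print(explode_exposure({"BASKET_OPT_1": 1_000_000, "INDEX_FUT_1": 500_000}))
# {'AAPL_EQUITY': 600000.0, 'MSFT_EQUITY': 400000.0, 'SP500_INDEX': 500000.0}
```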
Meanwhile, in an EDM Council webcast, managing director Mike Atkin emphasized the uses of its Financial Industry Business Ontology (FIBO), which, as he describes it, is not strictly a data model. "It's a formal and factual definition of reality, done by subject matter experts and validated by the industry," says Atkin. FIBO will have three versions: for business entities, for instruments and for loans. Beyond ontology, however, Atkin says semantic processing will be the future of data management, because it can bridge the gap between simple data dictionaries and ontologies.
The EDM Council worked with the standards body Object Management Group on FIBO, and has done a proof of concept for semantic processing using derivatives and FIBO, says Atkin. The effort is intended to support compliance with regulatory demands by taking in FpML, pulling data from relational databases, analyzing and classifying that data, and analyzing links to counterparties, he explains. The council's semantic processing plans show promise for pulling together different data standards; the specifics include implementation in XML format and possibly metadata annotations for derivatives, Atkin says.
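To give a flavour of the kind of processing Atkin describes — taking in FpML, classifying the trade and pulling out counterparty links — here is a minimal sketch over a simplified, FpML-style message. The element names and sample document are illustrative assumptions and are not tied to a specific FpML version or to the EDM Council's proof of concept.

```python
import xml.etree.ElementTree as ET

# Simplified, FpML-style trade confirmation (illustrative only).
SAMPLE = """
<dataDocument>
  <trade>
    <swap>
      <productType>InterestRateSwap</productType>
    </swap>
    <partyTradeIdentifier>
      <partyReference href="party1"/>
    </partyTradeIdentifier>
  </trade>
  <party id="party1"><partyName>Bank A</partyName></party>
  <party id="party2"><partyName>Dealer B</partyName></party>
</dataDocument>
"""

def summarise_trade(xml_text: str) -> dict:
    """Pull the product classification and counterparty links out of the message."""
    root = ET.fromstring(xml_text)
    product = root.findtext(".//productType", default="Unclassified")
    parties = {p.get("id"): p.findtext("partyName") for p in root.findall("party")}
    referenced = [ref.get("href") for ref in root.findall(".//partyReference")]
    return {
        "product": product,
        "counterparties": [parties.get(r, r) for r in referenced],
        "all_parties": parties,
    }

print(summarise_trade(SAMPLE))
# {'product': 'InterestRateSwap', 'counterparties': ['Bank A'],
#  'all_parties': {'party1': 'Bank A', 'party2': 'Dealer B'}}
```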
The positioning by Open Data Model and the EDM Council raises a question: which approach is the right way to go for data standards modeling? Is it organization and classification by asset class? Or is it linking together the proverbial alphabet soup of acronym-named formats? I'd like to hear your thoughts. Please post replies here, or respond via Inside Reference Data's LinkedIn discussion group, where this column will also appear.