Addressing Big Data One Byte At A Time
Strapline: Industry Warehouse
Unlike other recent industry buzzwords that have trended only to prove "much ado about nothing," the term "big data" represents real and important challenges for the technology and operations departments of financial firms. The capability to centrally maintain large volumes of data from disparate sources, and to process that data into useful information, will require a unique set of advanced tools. As the complexity, as well as the sheer volume, of the data grows, solutions will have to be dynamic and flexible in their design.
The first challenge is consolidating data—whether reference data, market data, corporate action data, or even pricing—from several sources. While financial firms are reducing the number of relationships they maintain with data providers, numerous feeds still exist across the enterprise and pose a real problem today. Moreover, each of those relationships was the result of a deliberate buying decision, and a dependence on that information has been created somewhere in the enterprise. The reduction of feeds may therefore be slow to materialize.
The solution obviously requires scalable architecture. Perhaps more important is the need to deploy flexible mapping utilities that can find, and even create, common fields across different file layouts. The lack of a common industry standard at present means the various files have to be cleansed and normalized against a defined format, and the resultant layout will need to be adaptable to future changes. This requires a solution with a flexible data model.
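To make this concrete, the sketch below shows what a minimal mapping utility of this kind might look like in Python. The vendor names, field names, and feed layouts are hypothetical, invented here purely for illustration.

```python
# A minimal sketch of a mapping utility that normalizes records from two
# vendor feeds into one canonical layout; all vendor names, field names,
# and layouts here are hypothetical.

# Per-vendor mapping from the vendor's field name to the canonical field.
FIELD_MAPS = {
    "vendor_a": {"Sym": "symbol", "Px": "price", "Ccy": "currency"},
    "vendor_b": {"ticker": "symbol", "last_price": "price", "curr": "currency"},
}

CANONICAL_FIELDS = ("symbol", "price", "currency")

def normalize(record: dict, vendor: str) -> dict:
    """Translate one raw feed record into the canonical layout."""
    mapping = FIELD_MAPS[vendor]
    mapped = {mapping[k]: v for k, v in record.items() if k in mapping}
    # Fields missing from a feed are carried as None, keeping the layout stable.
    return {field: mapped.get(field) for field in CANONICAL_FIELDS}

print(normalize({"Sym": "IBM", "Px": "192.50", "Ccy": "USD"}, "vendor_a"))
print(normalize({"ticker": "IBM", "last_price": "192.40"}, "vendor_b"))
```

The point of the per-vendor dictionaries is that adding a new feed becomes a configuration change rather than a code change, which is what gives the data model its flexibility.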
Once a consolidated format is agreed upon, a presentation layer needs to be designed and implemented so the information is of value to the end-user and client. As a result of recent market downturns, there is a growing demand throughout the industry, from regulators and investors alike, for quality data, with "assurance" and "transparency" two more important and very real terms. Addressing this requires sophisticated rules engines coupled with the automated mapping techniques mentioned above.
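As a sketch of the idea, the example below runs the canonical records from the previous example through a minimal data-quality rules engine; the individual rules and their checks are hypothetical.

```python
# A minimal sketch of a data-quality rules engine over the canonical
# records produced by the mapping sketch above; the rules themselves
# are hypothetical.

def _is_number(value) -> bool:
    """True if the value can be read as a number."""
    try:
        float(value)
        return True
    except (TypeError, ValueError):
        return False

# Each rule is a (name, predicate) pair; a record passes a rule when the
# predicate returns True.
RULES = [
    ("symbol present", lambda r: bool(r.get("symbol"))),
    ("price is numeric", lambda r: _is_number(r.get("price"))),
    ("currency is a 3-letter code",
     lambda r: isinstance(r.get("currency"), str) and len(r["currency"]) == 3),
]

def validate(record: dict) -> list:
    """Return the names of every rule the record fails."""
    return [name for name, check in RULES if not check(record)]

print(validate({"symbol": "IBM", "price": "192.50", "currency": None}))
# -> ['currency is a 3-letter code']
```

Because the rules are data rather than code, they can be extended as assurance and transparency requirements evolve, without touching the engine itself.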
In conjunction with the presentation layer, there is an acknowledged demand in the marketplace for business intelligence tools to address big data. Detailed analysis is the next logical step in turning the data into useful information. Again, this requires a scalable, high-end processing application as well as a sophisticated analytics solution. Besides the typical day-to-day requirements of any analytics solution, such as report and inquiry creation, a cutting-edge analytics engine should be able to generate output such as "what-if" scenarios and "in/out of the money" alerts. To make this functionality worthwhile, the solution needs to operate online and in real time. If new reports, inquiries, and alerts could be created by an end-user, as opposed to being coded by the IT department, valuable technical resources would be freed up and rollout times drastically reduced.
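The example below sketches what such an "in/out of the money" alert might look like for a vanilla option, assuming hypothetical option terms and price ticks; a production engine would subscribe to live market data rather than hard-coded values.

```python
# A minimal sketch of the "in/out of the money" alert described above,
# for a vanilla option; the option terms and price ticks are hypothetical.

def moneyness(option_type: str, strike: float, spot: float) -> str:
    """Classify a vanilla call or put as in, at, or out of the money."""
    if spot == strike:
        return "at the money"
    if option_type == "call":
        return "in the money" if spot > strike else "out of the money"
    return "in the money" if spot < strike else "out of the money"

def alert_on_cross(option_type: str, strike: float,
                   prev_spot: float, new_spot: float) -> None:
    """Emit an alert only when a new price tick changes the moneyness."""
    before = moneyness(option_type, strike, prev_spot)
    after = moneyness(option_type, strike, new_spot)
    if before != after:
        print(f"ALERT: {option_type} @ {strike}: {before} -> {after}")

alert_on_cross("call", 100.0, 99.50, 100.25)   # fires: out -> in the money
alert_on_cross("call", 100.0, 100.25, 100.40)  # no change, so no alert
```

Alerting only on a change of state, rather than on every tick, is what keeps a real-time stream of this kind useful instead of noisy.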
Lastly, as a result of recent economic declines, a very real challenge comes from the need for global regulation. While all market jurisdictions broadly agree on what is needed for increased transparency and systemic risk mitigation, the sources of this information are as segmented as the markets themselves, so the need for central processing and consolidation on a global basis is crucial. A pressing case in point is the introduction and global endorsement of the legal entity identifier (LEI), created to indicate the name, location, electronic address, and legal status of an organizational entity. The LEI itself requires the flexible data model mentioned earlier, as well as modern mapping tools: not only must financial firms incorporate the LEI into current data layouts, they must also link it to historical transactions generated pre-LEI. Data model flexibility and openness are instrumental to incorporating future global regulatory requirements in a timely and accurate way.
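The sketch below illustrates one way such a backfill might work, assuming a hypothetical crosswalk from legacy counterparty codes to LEIs; the codes and LEI values shown are invented.

```python
# A minimal sketch of backfilling LEIs onto pre-LEI historical transactions.
# The legacy counterparty codes, the LEI values, and the crosswalk table are
# all hypothetical; in practice the crosswalk would come from the firm's
# reference-data systems.

LEGACY_TO_LEI = {
    "CPTY-0042": "5493001HYPOTHETIC001",  # hypothetical 20-character LEI
    "CPTY-0077": "5299001HYPOTHETIC002",  # hypothetical 20-character LEI
}

def enrich_with_lei(transaction: dict) -> dict:
    """Attach an LEI where the crosswalk allows; otherwise flag the record."""
    lei = LEGACY_TO_LEI.get(transaction.get("counterparty_code"))
    enriched = dict(transaction, lei=lei)
    # Records with no match are routed to manual remediation.
    enriched["needs_remediation"] = lei is None
    return enriched

print(enrich_with_lei({"trade_id": "T1", "counterparty_code": "CPTY-0042"}))
print(enrich_with_lei({"trade_id": "T2", "counterparty_code": "CPTY-9999"}))
```

Flagging unmatched records rather than dropping them matters here: the historical transactions predate the identifier, so gaps in the crosswalk are expected and must be visible.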
In closing, the challenges posed by big data are significant and they exist right now. With the implementation of modern, robust solutions, however, not only can those challenges be met, but opportunities can be created for value-added features for firms and their clients.