Addressing Big Data One Byte At A Time

Industry Warehouse

Rene Keller, Information Mosaic

Unlike other recent industry buzzwords that trended but in actuality turned out to be “much ado about nothing,” the term “big data” represents important and real challenges to the technology and operations departments of financial firms. The capability to centrally maintain large volumes of data from disparate sources and to process that data into useful information will require a unique set of advanced tools. As the complexity, as well as the sheer volume, of the data grows, solutions will have to be dynamic and flexible in their design.

The first challenge is consolidating data—whether reference data, market data, corporate action data, or even pricing—from several sources. While financial firms are reducing the number of relationships they maintain with data providers, numerous feeds currently exist across the enterprise and pose a real problem today. In addition, each of these relationships was the result of an actual buying decision, and a dependence upon its information has been created somewhere in the enterprise. The reduction of feeds may therefore be slow to materialize.

The solution obviously requires a scalable architecture. Perhaps more important is the need to deploy flexible mapping utilities that can find, and even create, common fields across different file layouts. The lack of a common industry standard at present means various files have to be cleansed and normalized against a defined format. The resulting layout will need to be adaptable to future changes, which requires a solution with a flexible data model.
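
As a minimal illustration of how such a mapping utility might work, the Python sketch below normalizes records arriving in two different vendor layouts against one defined format. All field names and vendor layouts here are invented for the example; a production tool would drive the mappings from configuration rather than code.

    # A minimal sketch of a flexible mapping utility: records arriving in
    # different vendor layouts are normalized against one defined format.
    # The field names and vendor layouts below are hypothetical examples.

    CANONICAL_FIELDS = ["isin", "issuer_name", "currency", "price"]

    # Per-source mapping tables: vendor field name -> canonical field name.
    SOURCE_MAPPINGS = {
        "vendor_a": {"ISIN": "isin", "IssuerNm": "issuer_name",
                     "Ccy": "currency", "Px": "price"},
        "vendor_b": {"isin_cd": "isin", "issuer": "issuer_name",
                     "curr": "currency", "last_price": "price"},
    }

    def normalize(record: dict, source: str) -> dict:
        """Map a raw vendor record onto the canonical layout.

        Unknown fields are dropped; missing canonical fields are set to
        None so downstream consumers always see the same schema.
        """
        mapping = SOURCE_MAPPINGS[source]
        out = {field: None for field in CANONICAL_FIELDS}
        for raw_name, value in record.items():
            canonical = mapping.get(raw_name)
            if canonical is not None:
                out[canonical] = value
        return out

    # Two differently shaped records normalize to one common layout.
    print(normalize({"ISIN": "US0378331005", "Px": "189.30"}, "vendor_a"))
    print(normalize({"isin_cd": "US0378331005", "curr": "USD"}, "vendor_b"))

Because the per-source mappings are plain data, adding a new feed means adding a new mapping table rather than new code—the kind of data-model flexibility the paragraph above calls for.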

Once a consolidated format is agreed upon, a presentation layer needs to be designed and implemented so the information can be of value to the end user and client. As a result of recent market downturns, there is a growing demand throughout the industry, from regulators and investors alike, for quality data, with “assurance” and “transparency” being two more important and very real terms. To address this, the solution requires sophisticated rules engines coupled with the automated mapping techniques mentioned above.
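
The sketch below suggests what such a rules engine might look like when layered on top of the normalized records: each rule is a predicate plus a message, and records that fail any rule are flagged for review before reaching the presentation layer. The rule set is a hypothetical illustration, not any particular vendor’s product.

    # A minimal sketch of a data-quality rules engine applied to
    # normalized records. Each rule is (name, predicate, message); the
    # rules shown are hypothetical examples.

    RULES = [
        ("isin_present",
         lambda r: r["isin"] is not None,
         "ISIN is missing"),
        ("price_positive",
         lambda r: r["price"] is None or float(r["price"]) > 0,
         "price must be positive"),
        ("currency_iso",
         lambda r: r["currency"] is None or len(r["currency"]) == 3,
         "currency must be a 3-letter ISO code"),
    ]

    def validate(record: dict) -> list:
        """Return the list of rule violations for a normalized record."""
        return [msg for name, check, msg in RULES if not check(record)]

    record = {"isin": "US0378331005", "price": "-1.0", "currency": "USD"}
    issues = validate(record)
    if issues:
        print("flag for review:", issues)  # ['price must be positive']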

In conjunction with the presentation layer, there is an acknowledged demand in the marketplace for business intelligence tools to address big data. Detailed analysis is the next logical step in processing the data and turning it into useful information. Again, this requires a scalable, high-end processing application, as well as a sophisticated analytics solution. Besides the typical day-to-day requirements of any analytics solution, such as report and inquiry creation, a cutting-edge analytics engine should be able to generate output such as “what-if” scenarios and “in/out of the money” alerts. To make this functionality worthwhile, the solution needs to operate online and in real time. If new reports, inquiries, and alerts could be generated by an end user rather than coded by the IT department, valuable technical resources would be freed up and time to roll out would be drastically reduced.
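
By way of illustration, the sketch below shows one way an “in/out of the money” alert might be computed: a vanilla option position’s strike is compared against a live underlying price, and an alert is emitted only when the moneyness state changes. The position data, identifiers, and prices are hypothetical.

    # A minimal sketch of an "in/out of the money" alert of the kind such
    # an analytics engine might generate. All positions and prices below
    # are hypothetical examples.

    def moneyness(option_type: str, strike: float, spot: float) -> str:
        """Classify a vanilla option as in, at, or out of the money."""
        if option_type == "call":
            return "ITM" if spot > strike else "ATM" if spot == strike else "OTM"
        if option_type == "put":
            return "ITM" if spot < strike else "ATM" if spot == strike else "OTM"
        raise ValueError("unknown option type: " + option_type)

    def check_alerts(positions, spot_prices):
        """Yield an alert whenever a position's moneyness differs from
        the state recorded at the last check."""
        for pos in positions:
            spot = spot_prices[pos["underlying"]]
            current = moneyness(pos["type"], pos["strike"], spot)
            if current != pos.get("last_state"):
                yield pos["id"] + ": now " + current
                pos["last_state"] = current  # remember for the next check

    positions = [{"id": "OPT-1", "type": "call", "strike": 100.0,
                  "underlying": "XYZ", "last_state": "OTM"}]
    for alert in check_alerts(positions, {"XYZ": 105.25}):
        print(alert)  # OPT-1: now ITM

Because the alert logic is just a function over position and price data, an end user-facing tool could expose the same check through configuration, in line with the self-service point above.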

Lastly, as a result of recent economic declines, a very real challenge comes from the need for global regulation. While market jurisdictions generally agree on what is needed for increased transparency and systemic-risk mitigation, the sources of this information are as segmented as the markets themselves. The need for central processing and consolidation on a global basis is therefore crucial. A pressing case in point is the introduction and global endorsement of the legal entity identifier (LEI), created to indicate the name, location, electronic address, and legal status of an organizational entity. The LEI itself requires the flexible data model mentioned earlier, as well as modern mapping tools: not only are financial firms required to implement the LEI in current data layouts, they must also link it to historical transactions generated pre-LEI. Data model flexibility and openness are instrumental to incorporating future global regulatory requirements in a timely and accurate way.
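
The sketch below illustrates one plausible approach to that back-linking: a cross-reference table maps legacy internal counterparty codes to LEIs, and historical transactions are enriched in place, with unmapped records flagged for research rather than silently left behind. The legacy codes and the 20-character LEIs shown are invented for the example.

    # A minimal sketch of back-linking pre-LEI transactions via a
    # cross-reference table. The counterparty codes and LEIs below are
    # hypothetical examples.

    LEGACY_TO_LEI = {
        "CPTY-00123": "5493001KJTIIGC8Y1R12",
        "CPTY-00456": "529900T8BM49AURSDO55",
    }

    def link_lei(transaction: dict) -> dict:
        """Attach an LEI to a historical transaction via its legacy code.

        Transactions with no known mapping are tagged for manual
        research instead of being dropped.
        """
        lei = LEGACY_TO_LEI.get(transaction["counterparty_code"])
        transaction["lei"] = lei
        transaction["lei_status"] = "mapped" if lei else "needs_research"
        return transaction

    print(link_lei({"trade_id": "T-9", "counterparty_code": "CPTY-00123"}))
    print(link_lei({"trade_id": "T-10", "counterparty_code": "CPTY-99999"}))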

In closing, the challenges posed by big data are significant and they exist right now. With the implementation of modern, robust solutions, however, not only can these challenges be met, but opportunities for value-added features for firms and their clients can be created.
