Michael Shashoua: Simplicity and Security

The substance and emphasis of data management operations have evolved over the past few years. As the industry heads into the fourth quarter of the year, achieving greater simplicity in data collection and in the production of a “golden copy” is at the top of executives’ minds, even as the data itself becomes more complex and requires greater effort to keep secure.
A host of data issues, including those relating to the golden copy, come back to the data supply chain. Any consideration of whether federated or consolidated data models are better for a firm is affected by how data is sourced. Where it used to be possible to obtain golden copy data from a single, reliable source, a golden copy is now typically an amalgamation of multiple sources.
So, again, the data supply chain must be considered. The term “data supply chain” itself is often thrown around casually, without clear definition, so it may mean one thing to the professional using it and another to the colleague who hears it. The term evokes the idea of drawing on several data suppliers, but that isn’t really it. As John Bottega, the former chief data officer who is now a senior advisor and consultant at the EDM Council, says, the steps in the data supply chain are acquisition, processing and cleansing, maintenance, distribution and consumption.
Thought of this way, the processing steps that data undergoes affect how it can be handled in a federated or a consolidated fashion, as well as how multiple sources of data can be tied together and distributed accurately, most likely internally. Differences in processing or sourcing across inconsistent supply chains can produce discrepancies right from the start, while putting flawed data into the cloud to feed big data resources can undermine such aspirational solutions.
Governance Considerations
Similarly, firms weighing data governance strategies must consider how data is handled, whether by service providers or by centralizing it in-house. At Inside Reference Data’s European Financial Information Summit last month, Jacob Gertel, SIX Financial Information’s senior project manager for legal and compliance, considered the importance of data governance in the current regulatory climate.
To comply with the US Fatca tax withholding and reporting law, and to work with the Common Reporting Standard (CRS) used for Fatca reporting, the data that financial intermediaries deliver must be based on data files from their customers, Gertel said. As a result, this data, given its relevance for regulatory compliance, has greater value than it might otherwise have. And, Gertel said, SIX is seeking ways to make the data available to users without having to set up new management and distribution structures.
Complications
Considering how to make data available to users becomes more complicated in the wake of new rules aimed at increasing security, such as the European Union’s Cybersecurity Strategy and the European Commission’s Directive on Network and Information Security. Along with these, European regulators want firms to demonstrate that they have appropriate systems and structures in place for effective data protection.
The directives also require firms to know what their IT partners or vendors are doing about data security. “We all understand that there are speed-to-market and speed-of-compliance challenges from regulators, and global regulations are becoming increasingly strict,” explains Dan Crisp, managing director, EMEA, information risk management at BNY Mellon.
The data supply chain, especially its processing component, is the layer that underpins data governance strategies. When firms draft a governance plan for their data, it has to operate through those supply chains. Data format standards such as the CRS for Fatca, and security directives such as the EU Cybersecurity Strategy, will in turn shape the governance plans. No part of data management changes in a vacuum.