The Next Big Data Debate Emerges
The ongoing big data discussion, which continued last week in Waters' Big Data Webcast, appears to be shifting from a debate over cloud computing versus the Hadoop standard toward a concern that rapidly increasing data volume and velocity are creating a need for greater use of big data systems.
An unspoken context underlying the webcast discussion, whose participants came from Credit Suisse, BNY Mellon, Intel, IBM's Platform Computing and Sybase, is that the industry already appears to be moving away from Hadoop and toward the cloud as the more effective way to handle big data.
"The cost per gigabyte of storing that transaction over time is pushing us into cheaper, non-SQL, big data-type solutions," said Ed Dabagian-Paul, a vice president at Credit Suisse who works on setting strategy and direction for technology infrastructure at the firm. "The traditional big data solutions haven't mapped to our problems. We can answer most of our existing problems with existing data analytics or very large databases."
Daryan Dehghanpisheh, global director of the financial services segment at Intel, identified "volume, variety, value and velocity" as the four pillars of big data. Speaking with us in November, he had already singled out volume, along with processing speed and time, as key areas for big data.
Intel works with partners to produce solutions for operational challenges such as big data. According to Dehghanpisheh, the company aims to achieve complex machine learning, statistical modeling and graph analysis within big data, rather than the traditional business intelligence of query reporting and examining historical data trends. Orchestrating the use of metadata and setting data usage policies are important parts of administering big data operations, he adds.
An extensible framework is needed to manage the volume and velocity at which big data now pours forth, as Dennis Smith, managing director of the advanced engineering group at BNY Mellon, sees it. "There are tremendous cost benefits to this from a scale standpoint and particularly looking at volume use cases," he said. Cloud computing inherently offers greater scale, of course, and analytics can be layered onto or attached to it. As Smith also explains, Hadoop-related technologies, standalone analytics infrastructures, and traditional data warehouses used as staging areas may all be ways to manage big data in tandem with cloud resources.
The question to ask now is how to marry big data, whether sourced from or processed through the cloud, with analytical systems that can derive actionable meaning from it despite its increased volume and velocity.
Copyright Infopro Digital Limited. All rights reserved.