Policy Vet Hart to Head Up Data Coalition
Nick Hart is named CEO of the Data Coalition and interim president of the Data Foundation.
Open data has a new champion.
Nick Hart has been named CEO of the Data Coalition, an open data trade association, and interim president of the Data Foundation, an industry-focused open data research organization.
His experience mostly stems from work with the US federal government. He was director of the Bipartisan Policy Center’s Evidence Project, where he remains as a fellow, and previously was its policy and research director. His PhD is in public policy from The George Washington University, and he also holds advanced degrees from Indiana University Bloomington in environmental science and policy.
“Ensuring policies are effectively designed to encourage responsible data use offers great benefits for the American people,” Hart says.
In his new role, Hart will drive the Data Coalition’s policy agenda, which advocates a government-wide open data policy and the use of open data in government management, regulatory compliance, and legislation. He will also direct thought leadership, programming, and education aimed at demonstrating the value of open data for government and society.
“The Data Coalition has long advocated for modernizing the US financial regulatory reporting system so that reported data can be more useful,” Hart says. “This requires shifting reported information from unstructured documents into searchable, standardized, and machine-readable data.”
The Data Coalition hosts an annual RegTech Data Summit, and when asked how capital markets play into the organization’s mission, Hart highlights the Coalition’s ongoing support of the Financial Transparency Act (HR 1530 in the 115th Congress), which he calls the first regtech legislative proposal in the US.
“The proposal would direct the eight major US financial regulatory agencies to publish the information they collect from financial entities in an open data form: electronically searchable, downloadable in bulk, and without license restrictions. When government information is reported and published as data instead of documents, it reduces regulatory burdens on businesses, gives the public and investors better access to information, and boosts our ability to find, and prevent, instances of fraud,” he says.
The organizations are based in Washington, DC.
Copyright Infopro Digital Limited. All rights reserved.