Federal Agencies Get Guidance On New Data-Sharing Mandate

By Alex Woodie

August 23, 2013

Cancer statistics. Medicare fraud. Peanut recalls. Federal government agencies collect data on all types of phenomena, and just about every piece of data is useful to somebody, somewhere, at some time. More of that data will find its way into the hands of the public and entrepreneurs thanks to the government's new data-sharing mandate, and this month the government issued new guidelines telling federal agencies how to comply with it.

In May, the White House issued an executive order requiring government agencies to share data, on the premise that data is a national asset that can help fuel innovation, economic growth, and government efficiency. At the same time, the government adopted a new Open Data Policy requiring all newly generated government data to be shared with the public in machine-readable formats, such as CSV or plain text files.

This month, the White House’s Office of Science and Technology Policy provided guidance to help the agencies do exactly that. The assistance from the OSTP takes several forms, including: directions on how agencies can inventory and publish data; a set of FAQs on how the new policy affects the federal acquisition and grant-making process; a framework for creating measurable goals that agencies can use to track their progress; and a set of free tools, case studies, and other resources that agencies can download from the Project Open Data website at http://project-open-data.github.io.

According to the OSTP, federal agencies must, by November 13, have taken the following actions: create and maintain an "enterprise data inventory"; create and maintain a "public data listing"; create a process to engage with customers to help facilitate and prioritize data release; document if data cannot be released; and clarify roles and responsibilities for promoting efficient and effective data release.
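The "public data listing" requirement is typically satisfied by publishing a machine-readable catalog of an agency's datasets, in the spirit of the data.json convention promoted on the Project Open Data site. As a rough sketch of how a developer might consume such a listing, the snippet below parses a small, made-up catalog and pulls out download URLs for datasets offered in a given format. The sample entries and the exact field names (`dataset`, `distribution`, `format`, `downloadURL`) are illustrative assumptions, not a definitive rendering of the schema.

```python
import json

# A toy "public data listing" loosely modeled on the Project Open Data
# data.json catalog style. The datasets and URLs are invented for
# illustration only.
sample_listing = """
{
  "dataset": [
    {
      "title": "Peanut Product Recalls",
      "distribution": [
        {"format": "CSV", "downloadURL": "https://example.gov/recalls.csv"}
      ]
    },
    {
      "title": "Medicare Fraud Cases",
      "distribution": [
        {"format": "JSON", "downloadURL": "https://example.gov/fraud.json"}
      ]
    }
  ]
}
"""

def machine_readable_urls(listing_text, fmt="CSV"):
    """Return download URLs for every dataset offered in the given format."""
    catalog = json.loads(listing_text)
    urls = []
    for dataset in catalog.get("dataset", []):
        for dist in dataset.get("distribution", []):
            if dist.get("format", "").upper() == fmt.upper():
                urls.append(dist["downloadURL"])
    return urls

print(machine_readable_urls(sample_listing))
# prints ['https://example.gov/recalls.csv']
```

In practice the same function could be pointed at a real agency catalog fetched over HTTP; the point is simply that a consistent, machine-readable listing makes this kind of programmatic discovery trivial.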

The new Open Data Policy will benefit Americans and the economy, explained two government officials, Nick Sinai, U.S. Deputy CTO at the OSTP, and Dominic Sale, supervisory policy analyst at the Office of Management and Budget, in a letter posted this month to the OSTP website.

“Opening up a wide range of government data means more entrepreneurs and companies using those data to create tools that help Americans find the right health care provider, identify a college that provides good value, find a safe place to live, and much more,” they wrote. “It also empowers decision makers within government, giving them access to more information to enable smarter, data-driven decisions.”

The government has shared more than 63,000 datasets over the last 12 months, including more than 30,000 from the Department of Commerce and more than 20,000 from the Department of the Interior, according to data.gov, a clearinghouse for government data. The government is also putting together a new, cleaner-looking, and easier-to-use data-sharing website, which you can preview at next.data.gov.
