Federal Government Prepares for Big Data

Government CIOs and their staffs are taking key steps to embrace the big data initiatives already under way in the private sector. The U.S. federal government operates many large data warehouses that continue to grow and push current technology to its limits. As a result, agencies are taking steps, both in house and through partnerships with private companies and entire industries, to drive change. We recently reported on one such initiative from DARPA (the U.S. Defense Advanced Research Projects Agency), which has awarded $3 million to software provider Continuum Analytics to help fund the development of Python's data processing and visualization capabilities for big data jobs.

The money will go toward developing new techniques for data analysis and for visually portraying large, multi-dimensional data sets. The work aims to extend beyond the capabilities offered by the NumPy and SciPy Python libraries, which are widely used by programmers for mathematical and scientific calculations. The work is part of DARPA's XDATA research program, a four-year, $100 million effort to give the Defense Department and other U.S. government agencies tools to work with large amounts of sensor data and other forms of big data.
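To give a sense of the starting point the DARPA-funded work aims to extend, here is a minimal sketch of the kind of analysis NumPy and SciPy already support today. The data set is synthetic, standing in for the multi-dimensional sensor data the article describes; the specific statistics chosen are illustrative, not part of the XDATA program itself.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for a multi-dimensional sensor data set:
# 1,000 samples from 3 sensors (illustrative only).
rng = np.random.default_rng(seed=0)
readings = rng.normal(loc=20.0, scale=5.0, size=(1000, 3))

# Per-sensor summary statistics with NumPy.
means = readings.mean(axis=0)
stds = readings.std(axis=0)

# A simple SciPy analysis: test whether each sensor's readings
# are consistent with a normal distribution.
for i in range(readings.shape[1]):
    stat, p_value = stats.normaltest(readings[:, i])
    print(f"sensor {i}: mean={means[i]:.2f}, "
          f"std={stds[i]:.2f}, normality p={p_value:.3f}")
```

Analyses like this run comfortably in memory on a single machine; the XDATA effort targets the point where that stops being true, at data volumes far beyond what these libraries handle today.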

Federal terabyte databases are growing into petabyte databases, pushing the processing and storage limits of existing IT systems and testing the abilities of even the most experienced database managers.

A big data workshop held by the National Institute of Standards and Technology in January drew more than 800 attendees from federal agencies and the technology companies that work with them. IT leaders from the Department of Defense, the Department of Energy, NASA, the National Oceanic and Atmospheric Administration (NOAA), Veterans Affairs and the White House were among those who came to discuss the convergence of big data and cloud computing, big data life cycle management and big data analytics.

The Obama administration pushed big data up the federal IT priority list last March when it unveiled a formal research and development initiative aimed at developing new technologies for big data management and analysis. The goal is to spur breakthroughs in science and engineering, transform education and strengthen national security. In a blog post titled "Big Data Is A Big Deal," Tom Kalil, deputy director for policy at the Office of Science and Technology Policy, called for an "all hands on deck" effort among government, businesses, universities and nonprofits.

To kick-start the effort, six federal agencies -- the Defense Advanced Research Projects Agency, the departments of Defense and Energy, National Institutes of Health (NIH), National Science Foundation (NSF), and U.S. Geological Survey (USGS) -- announced plans to invest $200 million collectively in big data R&D. A new interagency steering group is crafting a national R&D strategy, the components of which include foundational research, development of IT infrastructure that's "big data ready," education and workforce development, and collaboration.

Some agencies have begun to develop their own plans for big data research and management. The Pentagon will spend $250 million annually on big data ($60 million of which is included in the $200 million in new federal research). One area of investment is a DARPA program called XDATA to develop "computational techniques and software tools for sifting through large structured and unstructured data sets," according to a White House document on the federal initiative.


James Finnan has been covering financial markets for 10 years. He has served as Editor in Chief of CFOZone.com since 2010. He has also been a contributing writer to the My Media network of sites, including CIOZone.com, myITview.com and myCIOview.com.