Agencies are facing a perfect storm of cloud computing and mobility, and departments are struggling to manage all the data that comes with it. But help is on the way.

The National Institute of Standards and Technology is leading a new effort to help integrate the cloud and mobile with big data.

NIST, which is holding a three-day workshop in Gaithersburg, Md., this week, is asking agency, private sector and academic experts for their help in tackling the challenges that come with integrating big data and cloud services.

"Like cloud, big data is going to change everything. We are really looking at a new paradigm, a place of data primacy, where everything starts with the consideration of the data rather than consideration of the technology," said Patrick Gallagher, director of NIST. "This is a real shift from the way we've historically thought about this. We used to be given the technology and then talk about the problems we could give this data. We've moved to designing these technologies around the data and, in some cases, defining the problems around the data."

As agencies have moved toward the cloud and through a more widespread use of smartphones and tablet computers, employees have found the amount of data available has increased significantly.

Departments also want to make better use of the information to improve their decision-making across mission areas. But for many, the question quickly becomes: how do they deal with the data tsunami?


"I think that this progression now leads very naturally both to Steve [VanRoekel's] governmentwide strategy for the federal government — it also has broadened — but in particular it raises the notion of one of the critical intersections between cloud computing and big data," Gallagher said. "Cloud is a multiplier when it's combined with other technologies, and it frees the users from the constraints of location and catalyzes data-driven inspiration for discovery."

The workshop’s goal, in part, is to answer some basic questions:

How do you define “big data?”

What new things happen at the intersection between cloud and big data?

What are the things that shape big data and cloud going forward?

What are the challenges?

What will break?

What new opportunities will open?

"This is a rare opportunity to bring together what are often two very different communities to look at the intersection of these two worlds," he said.

NIST and other agencies already are tackling the concept of big data and the cloud. Gallagher said the research agency is working on high-volume DNA sequencing data and looking at how data in the cloud could support the discovery and accelerated use of manufacturing materials.

Steven VanRoekel, the federal chief information officer, said the government holds so much data that unlocking it, like what the Defense Department did with Global Positioning System data or what the National Weather Service did with weather data, could have a huge impact on the lives of citizens.

“We’re right now starting to build policy and rules, and starting to change hearts and minds on the value of opening that data and looking at what we could do,” he said. “We’ve been holding and working with agencies to aggressively unlock data.”

A big data strategy?

VanRoekel didn’t comment on whether this intersection of cloud and data will require new strategies.

But others at the conference pointed to the roadmap the Office of Management and Budget has been asking agencies to follow over the last four years: a cloud strategy to go with the cloud-first policy and a mobile strategy to go with the mobile expansion. By that logic, asking agencies to create a big data strategy is not only possible, but likely.

VanRoekel said the goal is not just to have the data, but to know what you have and understand its value to the agency’s mission.

“We need your help in really thinking about the multiplier effect of this work,” he said. “The combination of cloud and big data not only can create useful insights, but also can create incredible value both on the value we provide downstream like the weather and GPS examples of public safety, economic benefits and other things, but real value in the way we think about decision making, the way we think about policy in the government, the way we think about creating mission-based systems, and I think there is a definite multiplier effect to be had at that intersection.”

Pete Tseronis, the Energy Department's chief technology officer, said from his agency's perspective it makes sense to have a big data strategy, though he said he had no knowledge of OMB's actual plans for requiring agencies to create such a document.

"Do we have a data scientist now? No. We have a lot of them in the labs. Do we have one individual who's looking at all the data sets, and compiling them and making sense of them? No. But it's logical that working with our enterprise architects, our mission owners to get our hands around the information could be a follow-on to the digital strategy," he said.

Value of data is hard to grasp

Tseronis added the digital strategy requires agencies to make all data machine-readable and make it accessible on mobile and public platforms.

He said it's up to the agencies to work together to create big data strategies.

Energy is one of several agencies trying to address its big data challenges.

Tseronis said Energy is trying to better understand the value of its data, not just the fact that it has lots of it.

"It starts with understanding what the mission is, what the intent of that funding that goes to support that program office and the data that is being created, how does it add value to the energy sector? Knowing that mission to how does IT enable it, so the value extracted on the surface is logical. The smart grid is doing research into improving the antiquated grid that we have today. Well, is there more value we can extract if we were exposed to more information? Or can we make sense of the data we are pulling and extracting from the grid itself? And that's where you get into leveraging tools that do a lot of that analysis."

Tseronis said Energy is in the formulation stage of this concept, but it’s part of the evolution of cloud, mobile and data. He said Energy needs to figure out which tools and processes are best to analyze and increase the data’s impact.

Energy's experience with cloud as a platform has been a good one, especially around high-performance computing. Tseronis said the Magellan program, which ran from August 2009 to December 2011, showed how the agency could benefit from using the cloud for high-performance data manipulation needs.

“If those supercomputers are busy or if there is a logjam of folks in the queue, then the cloud, whether in a public or private mode, is now an option for the Department of Energy if the computers themselves are not available,” he said. “It gives us an option that wasn’t available in the past.”

Tom Temin is the host of The Federal Drive, which airs from 6-9 a.m. on 1500 AM in the Washington, D.C., region and online everywhere. Tom has 30 years' experience in journalism, mostly in technology markets. Before coming to Federal News Radio, he was a long-serving editor-in-chief of Government Computer News and Washington Technology magazines.