Navy struggles to find the way ahead on big data

Jared Serbu, reporter, Federal News Radio

For the Navy, moving to a net-centric model of operations was a good first step, but not enough. The service is in the early stages of thinking about how to transform its IT operations so that they focus on the data itself and help Navy leaders make better decisions.

The march toward data-centric thinking started in 2009, when the Navy made the decision to undergo a significant reorganization at the top of its decision-making food chain, combining the Director of Naval Intelligence (N2) and the service's IT functions (N6) into a single directorate, now known as the Deputy Chief of Naval Operations for Information Dominance (N2/N6).

One of the Navy's first steps toward the information dominance idea was to start thinking of all of its ships, airplanes and other platforms as data-collecting sensors.

"So we've gone from netcentricity, to every platform as a sensor, to netting every sensor, and the point of all of it really is not just to have netcentricity but to enable good decisions based on the data that comes off of those sensors," said Rear Adm. Jan Tighe.

New organizational construct

Tighe is part of the relatively new Information Dominance construct established by former Chief of Naval Operations Adm. Gary Roughead. And her title reflects as much ambition as the refashioned N2/N6 organization does: She's the Navy's Director for Decision Superiority.

And Tighe told an AFCEA Northern Virginia audience Friday that the Navy recognizes having the data is one thing, but having it in an environment that makes it easily available to the people and systems that need it is something else entirely. And the Navy is a ways off from that goal.


"You can go to almost any mission area or functional area in the United States Navy and think about how much of our human capital gets spent diving into various databases and then manually aggregating it," she said. "If we could get to a place where our data is in a cloud that's understandable, that's smartly tagged, that's discoverable, we could easily get to solutions that don't require so much human intervention. That lets the humans deal with the higher-order thinking that's required to make those decisions."

That's the next step, Tighe said, and it's a lot easier said than done. The Navy's IT architecture isn't poised to free the data from the systems that house it, and neither are its policies.

"What we're thinking is along the lines of a big data-type approach," she said. "We need to ingest data in a way that lets us understand it well and share it across a number of different mission areas. We need to get to a more standardized way of dealing with data, and particularly big data."

Tighe said the big data issue isn't really a technological problem, as Amazon and Google have proven with their own decentralized, cloud-based information systems. Rather, it's a policy problem that the Navy, and DoD as a whole, need to overcome by rethinking things such as identity management and collapsing networks that don't interoperate with one another.

Following the IC's lead

The good news, she said, is a lot of that hard thinking already has been done by the government's intelligence agencies. The Navy's hoping to copy at least some of it.

"The intelligence community has been forced down this road as a result of the events of 9/11, and they are to some extent leading the charge on how to deal with handling big data," she said. "The policy changes we need to handle with DoD data have a lot of parallels with what the intelligence community has already done, and we want to leverage that. What we're talking about is making organic Navy data discoverable for all the applications that might need it, so that we're not duplicating data in all these various places."

The vision is to tag that data in such a way that it's accessible not just to the Navy, but to other DoD components and even members of multinational coalitions, and in such a way that it lets the military easily share data across traditional boundaries, across agency cultures and across classification levels.

But for now, Tighe said, that new network and policy architecture is just a vision. The Navy is expected, any day now, to formally ask for industry proposals to take on the operations of most of its IT network under the NGEN contract. And for at least the short term, that network is going to look an awful lot like it does today.

"We don't have something we can hand the acquisition community and tell them, 'we want it to look like this,'" she said. "The NGEN RFP is going to go forward as planned. We will then look for what our plan is to migrate some of these new ideas into it. Our challenge is going to be figuring out the cost of migrating in this way, and also the schedule. What are the contractual limits that we're already in? What are the offramps that might let us bring some of this new architecture? We're trying to look at all of that across all of our programs, and we're looking at it in conjunction with data center consolidation."

The Navy's leadership also wants its future IT systems to take advantage of the cost-saving and security benefits of virtual desktops, replacing what Tighe said are frequently three or four physical computers beneath an individual Navy user's desk.

At the same time, she said it's important that the Navy and DoD move to virtualized and cloud environments in a coordinated fashion.

"It is my belief that many of the folks in the Navy with big data requirements are headed in this direction on their own," she said. "What we can't have is to have 400 instantiations of different clouds out there supporting the Navy. Then we can't work across those clouds. The application has to understand how the data is formatted, and if everybody does it a little differently, that's not going to work. So we're trying to get ahead of that as best we can without spooking the herd."