Organisations in the public and private sectors need to be prepared to address some of the key challenges related to networks, security, and legacy workload virtualisation when moving to the cloud, explained Dawn Leaf, chief information officer at the US Department of Labor.

“In and around 2011, the Office of Management and Budget said we were spending $82bn a year across all agencies on information technology. About 80 per cent of that was on infrastructure, and 80 per cent of that was on operations and maintenance,” she explained. “So these were some of the main drivers for our datacentre consolidation and cloud migration efforts.”

The US Department of Labor is a public sector organisation tasked with providing a range of welfare services to wage earners in America. Leaf, who currently heads up IT strategy for the department and previously worked with NIST on defining cloud computing standards, told an audience at Cloud Expo in London this week that her department struggled in a number of areas when it decided to make the move to the cloud a few years ago.

The first move it made was to migrate 1,900 staff across more than 500 locations from legacy Exchange servers to Office 365 for email. It is also in the process of consolidating its datacentres and slimming down the number of vendors it contracts with.

“We see reductions in physical space costs and operational costs. If you have two different security products, you need to support that with two people. By the time they’re fully burdened they’ll cost the taxpayer $200,000 a year each. So just by moving to one product we’re saving money.”

“One of our greatest challenges was something not on the standard tech roadmap provided by NIST. It was getting ready as an agency ourselves to connect to cloud services. That means ensuring we had the network and security infrastructure in place to access them,” she said. “We had nine different network infrastructures that weren’t standardised, and we found 150 inconsistencies in our networks which had to be resolved before we could even connect one of our departments.”

Leaf also said the department had to significantly boost bandwidth and upgrade employees’ desktops – the department still had about 10,000 people running Windows XP. It also suffered interoperability issues, problems which the vendor in question, Microsoft, initially declined to take responsibility for.

“When connecting we had a 30 day lag time, and Microsoft naturally assumed that the problem was the telecoms organisation or our department. To their credit, they eventually identified it, told us about it, and fixed it. That was a challenge.

“We had issues not just with cloud but hand-held devices, and the assumption in this case was again that it was BlackBerry’s fault. But again the problem wasn’t BlackBerry, it was Microsoft.”

Leaf admitted that government organisations face particular constraints affecting how quickly the Department of Labor, among others, can move some of their systems to the cloud, particularly in terms of security. In the US, the Federal Information Security Management Act of 2002 sets out a number of procedures and requirements related to information governance depending on risk level. Data classified at the low end of the risk scale requires close to 280 unique security controls, while the highest-risk data requires over 1,000. Cloud vendors, via FedRAMP, only need to demonstrate they can satisfy these requirements once, but there are still related information governance principles that must be implemented at the departmental level.

“SharePoint is a challenge, we have well over 100 individual instances, and what we found is that having a policy for your enterprise SharePoint is something that you should address. If you just migrate before you do this it’s going to be a governance nightmare.”

But she explained that one of the department’s greatest threats is insider threats, adding that the department “receives and screens thousands weekly if not daily.”

“There isn’t a difference in terms of security with cloud, just new procedures and technical controls. Once you’re over those thresholds, high-security high-risk data, it’s not less secure per se, but the costs that you as a vendor would incur to secure that environment jump so high that it would be better to use a private cloud approach or federated cloud approach to satisfy those requirements.”

She also told BCN that the biggest challenge is, perhaps unsurprisingly, managing and moving the massive legacy estate the department runs. Complicating the typical issues around legacy re-architecture and migration, many of the department’s digital services are widely used – with ten to twenty thousand users at the lowest end.

“The legacy workloads aren’t ready to go to the cloud, they weren’t designed that way. And we’re really planning on the redesign process, which we’ll time with technology refreshes,” she said. “But because we have so many users using these services it’s very challenging to keep them up while redeveloping them. We’ll move the data but we’re focusing more on greenfielding where we can.”

Jonathan Brandon is editor of Business Cloud News, where he covers anything and everything cloud. Follow him on Twitter at @jonathanbrandon.