Can the Cloud Clear the Mission-Critical Hurdle?

The cloud wants enterprise data, and so far it has been fairly adept at gathering the low-hanging fruit: mostly bulk storage, archives, backup and recovery (B&R), low-level database workloads and other non-critical data.

But the real money is in the advanced applications – the kind of data that organizations will pay a premium to support because it brings the highest value to emerging business models. Herein lies the conundrum: that same high value leads the enterprise to keep critical data close to the vest, which means cloud providers must go the extra mile to win enterprise trust. And for the most part, that has not happened yet.

This is a shame because in terms of both security and uptime, the cloud is at least on par with the typical enterprise, and on certain key metrics is actually superior. Cloud tracking site cloudharmony.com offers service status data for many of the top cloud providers going back at least a year, and its latest chart shows many services delivering four- or even five-nines availability. That puts outages at providers like Amazon EC2 and Google Cloud Service at mere minutes per year, while even three-nines performers confine their downtime to a few hours at most. A perfect record? Not by a long shot, but certainly no worse than the vast majority of enterprises out there.
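As a quick sanity check on those figures, the "nines" convert to maximum yearly downtime with simple arithmetic – a throwaway sketch (the function name here is ours, not an industry standard):

```python
# Convert an availability level expressed in "nines" into the
# maximum downtime per year it permits.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes

def max_downtime_minutes(nines: int) -> float:
    """Worst-case minutes of downtime per year for N nines of
    availability (3 -> 99.9%, 4 -> 99.99%, 5 -> 99.999%)."""
    return MINUTES_PER_YEAR / 10 ** nines

for n in (3, 4, 5):
    print(f"{n} nines: {max_downtime_minutes(n):.1f} minutes/year")
```

Three nines works out to roughly 8.8 hours a year, four nines to under an hour, and five nines to a little over five minutes – which is why the article's "minutes per year" and "a few hours at most" characterizations hold up.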

So aside from continuously maintaining top performance and state-of-the-art infrastructure, what can the cloud industry do to get a little mission-critical loving from the enterprise? According to Lee Field, head of IT consulting and complex solutions at Verizon, the challenge is much greater when it comes to drawing established enterprises than start-ups, many of which are coming out with cloud-native strategies right from the start. But rather than focus on infrastructure and technology to woo enterprise clients, providers should make sure that their service catalogues are up to par and that the critical nature of each app is not only understood, but categorized and mapped properly. And top-notch migration capabilities wouldn’t hurt either.

Until cloud computing gains serious momentum (we are still in the very early stages, after all), fear, uncertainty and doubt (FUD) will remain high, says Global Financial's Sean Catlin. But understanding what drives FUD will allow providers to break it down more quickly, and in this case the chief enemy is the unknown. Even highly educated people fear what they don't understand, so removing the complexity of cloud services will go a long way toward improving trust. Many providers are even going so far as to lend their own cloud expertise to prospective clients, helping them not only to see the benefits of cloud computing but to shore up client-side infrastructure to make it more compatible with hosted solutions. And the longer the cloud industry as a whole can go without a high-profile outage, the better.

Of course, it would be nice to know from established enterprises what they require in order to push more of the mission-critical load to the cloud. While everyone's needs are different, the largest enterprise in the world – the U.S. federal government – is at least taking some steps to define its requirements. This is coming in the form of the Federal Risk and Authorization Management Program (FedRAMP), a National Institute of Standards and Technology (NIST) effort to establish baselines for what it calls "high/high/high" levels of confidentiality, integrity and availability. Aimed primarily at law enforcement, health care and financial workloads, the program is intended to guide providers on the rules and requirements for handling sensitive government data, many of which involve automation and the removal of human error. The agency is currently reviewing comments on the program, and final ratification is expected by year-end.

Mission-critical data is the crown jewel of the enterprise, so it is understandable that organizations will not give it up easily. That puts the burden on the cloud community to show that it is ready to step up to the plate, preferably with marquee clients that know the value of their data and strive to maintain the highest standards for their current in-house support.

And as with most things data-related, once the mission-critical dam is breached, the flood is inevitable.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.