View From Above

In an effort to win the hearts and minds of IT, Microsoft trotted out a new set of cloud policies last week it’s calling the “Support Lifecycle Policy for Online Services,” pledging to give customers a minimum of 12 months’ notice before shutting down any Microsoft cloud service or making a change that requires a major shift in the way an application functions (such as requiring an upgrade to Outlook to stay compatible with hosted Exchange services).

It’s all well and good as far as it goes, but what does it really mean? When you look at Office 365, for instance, many of these services are tied to Office, and Microsoft has little choice but to stay the course no matter what if it hopes to continue to compete with Google Docs in the cloud over the long term.

As for the other issues of how Cloud services could disrupt other systems — such as a required upgrade to Outlook to stay compatible with hosted Exchange services — it seems that companies moving to the cloud are doing so precisely so they don’t have to worry about stuff like that.

Hard to say, but the timing is interesting, coming a week after Google’s. Competition does tend to spur these big companies to act, and Microsoft is already well behind when it comes to cloud and mobile — two areas where, as I’ve written, the company has to be making big money if it hopes to stay as profitable as its recent $20 billion quarter.

When you delve into the numbers of that quarter, however, what you aren’t seeing is profitability in mobile or cloud. According to reports like this one from Technologizer, Microsoft lost more than $500 million from its online ventures, which include Windows Live services and Bing among others.

In fact, according to the Technologizer article, Microsoft hasn’t made money online for *20* straight quarters. If you’re doing the math at home, that means 2005 was the last time its online division was profitable, and this is with Microsoft throwing millions and millions of dollars at advertising and promoting these services, including the cloud. (Click through to the article and take a look at the chart; it paints an ugly picture.)

So party tricks like the new Support Lifecycle Policy for Online Services could lure some IT folks who are thinking about the cloud and have apprehensions about viability or control over change, but I don’t think it’s going to sway anyone toward Microsoft solutions who hadn’t already been considering them.

Microsoft, as usual, likes to trot out these types of announcements, spending way more money and time than it should on trivialities. If it really wants to make a splash, it has to concentrate on the products and the content of its cloud solutions, not marketing gimmicks that aren’t going to convince anyone who’s shopping carefully one way or the other.

Building a private cloud infrastructure inside your organization is not a trivial matter, so it’s surprising to learn that a survey of senior-level IT pros found that an astounding 52 percent have cloud resources they rarely or never use.

But the survey, which was conducted by Electric Cloud, a private development cloud company, and Osterman Research, ultimately leaves more questions than answers.

James Staten, a Forrester analyst who covers cloud computing, defines a private cloud in his July 2010 report, You’re Not Ready for an Internal Cloud, as “a standardized, self-service, pay-per-use deployment model.” Essentially, what Staten means is a set of standard services set up on a web site behind the firewall and treated internally not unlike a public cloud service. Users “buy” the services they need and are billed based on what they use. This is in stark contrast to a client-server model, in which users generally are given carte blanche to use company services, with few options for charging back for varying degrees of use.

One of the key principles of a private cloud is virtualization, which enables IT administrators to make much better use of finite physical resources by treating them as a pool, then slicing, dicing and reallocating them as needed across the organization. Since users and departments are billed based on usage, they are much more likely to give up those resources when they are finished with a project.
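To make the pool-and-chargeback idea concrete, here’s a toy sketch in Python. Everything in it is hypothetical — the department names, the $0.05 per VM-hour rate, the class itself — purely to illustrate metered, pay-per-use billing against a shared pool of virtualized resources:

```python
from dataclasses import dataclass, field

@dataclass
class ResourcePool:
    """Toy model of a private-cloud pool: a fixed number of VM slots
    that departments check out and are billed for by the VM-hour."""
    capacity: int                    # total VM slots in the shared pool
    rate_per_vm_hour: float          # internal chargeback rate
    allocations: dict = field(default_factory=dict)  # dept -> VMs held
    usage_hours: dict = field(default_factory=dict)  # dept -> VM-hours used

    def allocate(self, dept: str, vms: int) -> bool:
        """Grant VMs from the shared pool only if capacity remains."""
        in_use = sum(self.allocations.values())
        if in_use + vms > self.capacity:
            return False             # pool exhausted; request denied
        self.allocations[dept] = self.allocations.get(dept, 0) + vms
        return True

    def run(self, dept: str, hours: float) -> None:
        """Record metered usage for a department's current allocation."""
        vms = self.allocations.get(dept, 0)
        self.usage_hours[dept] = self.usage_hours.get(dept, 0.0) + vms * hours

    def release(self, dept: str) -> None:
        """Return a department's VMs to the pool when the project ends."""
        self.allocations.pop(dept, None)

    def bill(self, dept: str) -> float:
        """Chargeback: pay only for metered VM-hours, not a flat grant."""
        return self.usage_hours.get(dept, 0.0) * self.rate_per_vm_hour

pool = ResourcePool(capacity=100, rate_per_vm_hour=0.05)
pool.allocate("marketing", 20)
pool.run("marketing", 10)       # 20 VMs * 10 hours = 200 VM-hours
pool.release("marketing")       # freed slots return to the shared pool
print(pool.bill("marketing"))   # billed only for the 200 metered VM-hours
```

Because the bill tracks metered VM-hours rather than a flat grant, a department that releases its VMs promptly pays less — which is exactly the incentive Staten’s pay-per-use model is designed to create.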

So that brings us back to the unused resources reported in the survey. It’s important to point out a couple of data points here. First of all, of those surveyed, 48 percent reported having or planning a private cloud. The survey doesn’t break that down any further, but we have to assume that a percentage of those are still planning and haven’t implemented.

So of the number that have implemented a private cloud, more than half have resources they rarely or never use. But what percentage of their total resource package that comprises, again, the survey didn’t say, and that’s a major question.

One other data point that could help provide an answer is that 47 percent report some or lots of excess capacity. Again, that’s a vague spectrum (how much is “some” or “lots”?), but it does give you a sense that those running a private cloud haven’t figured out how to maximize usage.

Of course, they probably weren’t maximizing usage before they had a cloud. It’s just easier to quantify with cloud infrastructure than a traditional client-server one.

One data point is clear, however: many companies are just getting involved in cloud computing. Thirty-nine percent of those using a private cloud just started in 2010. Perhaps that explains why these companies haven’t mastered how to maximize it yet.

Or perhaps excess capacity isn’t a bad thing. It means they may not have to invest in more physical hardware when they need additional capacity, and chances are most companies will need additional capacity at some point in the future. It’s good to have some available when needed.

For now, at least, these numbers show an area of IT that’s still immature. As IT pros become more comfortable using and managing private clouds, these numbers should shift. It will be interesting to see a similar report next year and find out whether the results have changed.

Last week, Lubor Ptacek, a blogger and VP of Marketing at content management vendor Open Text, wrote a post on his personal blog called Mobile Device as Primary Interface. It was Ptacek’s contention that his colleague, whom he captured in a picture using his Blackberry in his office instead of a nearby laptop, was part of a transformation to a mobile-centric world.

Given the choice of his laptop or his Blackberry, even with the laptop within easy reach, the colleague chose the mobile device — and that’s telling. As smart phones get more sophisticated and network connection speeds get faster, it’s something that’s bound to become so commonplace we won’t need a picture and a blog post to point it out.

In fact, it’s highly likely that a similar dynamic is happening as we speak inside your organization (or at least it should be). As I wrote last week in iPads are Coming – Ready or Not, the tablet may be helping drive the trend away from the PC toward a world in which we use either our mobile smart phone or a tablet when we need a larger screen. As I pointed out in my post, many organizations are looking very carefully at tablet solutions today.

The fact that this trend is coming together as we speak is particularly interesting to me because in 2002, I wrote my very first article for EContent Magazine (where I still write today as Contributing Editor) called Project Oxygen: A Breath of Fresh Air for the Internet. The article described an MIT project with what was a fairly radical notion at the time: that our primary access to computers and the Internet would be via hand-held devices and that network access would be as ubiquitous as the air we breathe (like oxygen).

As I wrote at the time:

The stated goal of the project is to create a computing system that is “human-centered and pervasive,” meaning that the computer reacts to human needs, rather than forcing people to work according to the computer’s design, and that the computer and network connections are always available no matter where you are just as the air you breathe is always available.

The project included a mobile device that could change to support whatever activity the user required whether that was a PDA, a phone or a pager (much like a smart phone today) and the network component sought out the strongest connection based on location and task (not unlike Cell and WiFi networks today, although that still needs some work to achieve the full goal of the project).

The point, though, is not that we achieved every component of Project Oxygen to the letter, but that we have achieved so much of it so quickly, and, as the article speculated, it has changed the way we consume content — both as consumers and at work.

As Ray Ozzie pointed out in his farewell ‘Dawn of a New Day’ memo to Microsoft employees last fall, he sees a time in the not-too-distant future where we live in what he called a “post-PC world.” As he put it, “…slowly but surely, our lives, businesses and society are in the process of a wholesale reconfiguration in the way we perceive and apply technology.”

I believe that wholesale reconfiguration centers around these new mobile devices like tablets and smart phones (not to mention ‘The Cloud’), and that Ptacek’s post shows that perhaps we are further along than any of us expected or realized.

The lesson here for IT, in case you missed it, is that if you’re not focusing on these devices today, you’re probably already behind. Just as Ozzie warned his company on his way out the door, “close your eyes, and picture what a post-PC world might look like,” because if you ignore Mr. Ozzie, you could be left in the dust.

Make no mistake about it, iPads and other tablets are coming to the enterprise whether you’re prepared or not, and you need to have a mobile strategy in place including devices, apps and security and governance considerations.

Just in case you doubt me, consider what Tim Cook, acting Grand Poobah and actual Chief Operating Officer at Apple had to say at yesterday’s rather impressive earnings report. According to published reports of the earnings call including this one in The Register, Cook said that 80 percent of the largest companies have used or are evaluating the iPad.

The Tech Republic reports that the number of Fortune 100 companies running iPad pilots grew by 65 percent from September to December, and Cook was quoted as saying Apple sees this as really just the tip of the iceberg.

“Generally enterprises are much slower, much more cautious and uses things that have been in the market for a long time. I think to everyone’s credit they have seen the value of this from a productivity and creativity point of view, and they are really moving fast. So I think we are just scratching the surface right now,” Cook said.

And a December survey from Citrix, as reported on ZDNet, seems to back up the idea that, at the very least, enterprise users are moving toward the iPad in a big way. Consider that 60 percent of respondents said they were prepared to purchase an iPad for work, 46 percent used it daily and 13 percent considered it mission critical to their work, a pretty remarkable figure when you consider the iPad didn’t even exist until last April.

But it’s worth noting, the survey wasn’t all sunshine and light for Apple. When asked why their company wouldn’t purchase the iPad, security was the chief concern, followed by a policy not allowing data on any device but a company PC. Both of these can be alleviated, however: the former by security tools being developed even as we speak and the latter by less stringent policies.

In fact, consider that Guidance Software released an update to EnCase Neutrino this week that supports eDiscovery and digital forensics on iPads, iPhones and even the iPod Touch. It’s a tool that corporate security and governance teams need as more and more of these devices proliferate in the enterprise.

And of course, it’s not just Apple products. As we move into 2011, we will begin to see even more tablet offerings from a variety of vendors running a variety of operating systems. As with mobile phones, enterprise IT will be left to evaluate the myriad of options available and decide which devices and operating systems they choose to support.

This doesn’t mean you should be overly cautious, but you do have to decide where to place your bets, because chances are you can’t support every device under the sun (not without going insane). From an organizational standpoint, though, you have to at least look at how these devices might help your staff improve productivity moving forward.

You don’t necessarily have to buy into Apple–although your users might be clamoring for it–but you have to start looking and formulating a plan and policy, because chances are you’re going to start seeing tablets in the enterprise whether you’re ready or not.

Let’s face it, when you look at the mobile smart phone market, the consumer side is saturated, or at least clogged at the top, but the enterprise is still ripe for the taking, and it should be the next big flash point for the big three: Apple, Google and Microsoft.

For the time being at least, Research in Motion (RIM) still owns the enterprise. I’ve noticed on recent trips as I look around the airport, that Blackberries and iPhones seem to dominate. I know that Android has been gaining market share in bushels, but it’s apparently not among business people (if my informal observation is any indication, that is).

Business people seem most comfortable with a Blackberry, but a recent survey of 2,400 enterprise users conducted by MicroStrategy suggests that the enterprise could be looking more at the iPhone in the coming year. One key result found that among current deployments, Blackberry was way ahead with 72 percent. iPhone was second at 54 percent, while the iPad was third with 37 percent. Android had just 24 percent and Windows Mobile just 17.

But if you look at planned deployments, RIM is at 56 percent (down 10 percentage points since a similar survey in June), while iPhone leads the pack with 62 percent. Interestingly, iPad had the largest increase since the June survey, up 15 points to 55 percent of respondents. The bad news for Microsoft was that planned deployment for Windows Mobile was down 6 points since June, sitting at 19 percent.

So you have Blackberry with a substantial presence and Microsoft with a negligible one that appears to be fading. That’s why if I were in charge of Microsoft mobile strategy, I would forget about catching Google and Apple in the consumer space and concentrate on the enterprise where Microsoft already has a strong foothold with Windows, Office, Exchange and other enterprise stalwarts.

To take it one step further, if I were in charge at Microsoft, I would make a strong play for RIM. If you combine Microsoft’s obvious enterprise strengths with the mobile enterprise presence of RIM, Microsoft could be a powerful player in that space and the two companies together could give Google and Apple a run for their money in the enterprise.

What’s more, if you look at the burgeoning tablet space, for now the enterprise is completely wide open, and mocoNews reported on Friday that RIM could be shipping 1 million Playbook tablets next month. As the article points out, Apple sold 7.6 million iPads in the first six months, so a million is a drop in the bucket. But if RIM is able to get a foothold in the enterprise with the Playbook, it’s another opportunity for RIM (or Microsoft, if it follows my advice).

The last I checked, Steve Ballmer wasn’t reading my blogs (and he’d probably be angry about Friday’s post if he were), but this is a plan that makes sense for both parties. Microsoft has plenty of money, and RIM could expand its market with the clout of Microsoft behind it.

I haven’t mentioned Google much, but if the survey is any indication, companies seem to be hedging their bets with Android rather than going after it whole hog. That doesn’t mean Google doesn’t want a presence there. In fact, it has a web site devoted to Enterprise Mobile and will no doubt fight for market share with Apple and Microsoft.

But for now, at least this market remains wide open. If Microsoft wants to beat Apple and Google, it needs to do something bold, and purchasing RIM could be the boost it needs. Whether that will happen, I can’t say, but this is clearly a market for the taking. The question is, who’s going to step up?

Is Microsoft really fully committed to the cloud, as CEO Steve Ballmer has continually said over the last year to anyone who will listen? Looking at the spate of recent departures, it’s hard to know, but given the company’s core desktop business, I find it hard to believe that it is.

Joe Wilcox, writing in Beta News on Thursday, suggested the public departure–what he referred to as a “public execution”–of Bob Muglia, president of the Server and Tools division, was mostly due to Muglia’s failure to embrace the cloud. Wilcox wrote:

“The public nature of Muglia’s departure also communicates to Wall Street just how serious Ballmer is about the cloud and transforming the server business to embrace it — the same way Allard’s and Bach’s departures showed renewed commitment to transform the mobile business.”

But does it? I’m not so sure Wilcox is right on this one and I say this for a couple of reasons. First of all, let’s look at the departure of Ray Ozzie, who announced he was leaving last October. Ozzie, after all, was the man Ballmer hired to develop and nurture a cloud strategy at Microsoft. When he left, it seemed to me that his parting email was a shot across the bow to Microsoft that perhaps it wasn’t embracing the cloud as much as public statements had suggested.

The email was full of juicy quotes, but one of my favorites was this one directed squarely at the heart of Microsoft’s problem:

This doesn’t sound to me like he sees a rosy cloud-mobile future for Microsoft. Quite the opposite. Instead, it sounds like he sees a company that refuses to shed its client-server roots and take the necessary steps to make the transition it absolutely needs to make to survive in what Ozzie refers to as the next-generation “post-PC world.”

Wilcox could be right about Ballmer consolidating his power, but I don’t think it’s because he wants to execute some big cloud-mobile vision and these executives stood in the way. Instead of a visionary, I see a leader in disarray, one who reacts to the changing landscape around him rather than trying to develop and control it.

The bottom line is that Microsoft won’t abandon Windows and Office on the desktop, and as such, it’s committed to the increasingly complex desktop software Ozzie warned about. By contrast, if you look at the recently launched Mac App Store, you see smaller apps that do one or two things well alongside word processors and other traditional desktop software.

Microsoft is going to be a formidable company for the foreseeable future regardless of what it does, just by the sheer momentum it has from the Windows/Office tandem, but it’s going to take a lot more than lip service and some executive turnover to transform the company.

It’s going to take clear vision, and I’m not convinced Ballmer is the man who has what it takes, his recent power plays notwithstanding.

Cloud computing has gone mainstream, for better or worse. In case you had any doubts, I had an experience this past weekend that drove home just how pervasive the cloud has become, or perhaps it showed that it’s truly entering the public consciousness as a marketing buzz word.

First, a friend sent me this Dilbert cartoon (which IT pros should love), in which Dilbert’s boss hires Dogbert as a cloud consultant whose contribution is “Blah Blah Cloud.” It’s funny because that’s how some consultants behave today, throwing around cloud terms without really understanding the risks and benefits of the approach.

But what really drove home the mainstreaming of cloud computing for me was a conversation I had with a friend of my 82-year-old father’s. When I met him, he asked me what I do for a living. When I told him I was a freelance technology writer, what’s the first thing he asked? He said, “Oh, you’re a technology writer, what the heck is the cloud?!” I had to laugh out loud, seeing as I write about it so often.

I explained to him that the cloud was really about using online services instead of software or storage on your own computer. As a former physics professor he had no trouble understanding my explanation. He had no use for it either, but that’s not the point.

The point was that here was this 80-something man asking me out of the blue what the cloud is about. Just recently, my wife and I were talking about how to back up all our digital photos, and she suggested, “Why not the cloud?” It was another of those ‘huh?!’ moments, because I wondered where it came from.

I have the feeling both came from the recent Microsoft “To the Cloud” ads, which are focused on the consumer cloud. As IT pros, though, you can’t simply fall for marketing buzzwords, and you can’t rely on poser consultants like Dogbert. You need actual data and answers.

One thing you can be sure of, however, is that cloud computing as a concept does have real utility in the enterprise. Services like Amazon S3, Rackspace, Salesforce.com and so many others are selling useful services that provide your company with a lower-cost way of maintaining software and services than you typically can on your own.
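That “lower cost” claim usually comes down to simple arithmetic: amortized hardware plus staff and facilities overhead versus a metered monthly bill. Here’s a back-of-the-envelope sketch in Python; every number in it is hypothetical, so plug in your own before drawing any conclusions:

```python
def on_prem_annual_cost(hardware: float, lifetime_years: float,
                        admin: float, power_cooling: float) -> float:
    """Rough yearly cost of running a service yourself:
    amortized hardware plus staff and facilities overhead."""
    return hardware / lifetime_years + admin + power_cooling

def cloud_annual_cost(unit_price: float, units_per_month: float) -> float:
    """Pay-per-use: you're billed only for the capacity you consume."""
    return unit_price * units_per_month * 12

# Hypothetical numbers for a small storage workload -- not real quotes.
diy = on_prem_annual_cost(hardware=12000, lifetime_years=3,
                          admin=8000, power_cooling=1500)
hosted = cloud_annual_cost(unit_price=0.14, units_per_month=2000)  # $/GB-month
print(f"on-prem: ${diy:,.0f}/yr vs cloud: ${hosted:,.0f}/yr")
```

The point of the exercise isn’t the specific figures; it’s that the comparison is easy to run for your own workload, and that the answer can flip once you count the staff time and facilities overhead that on-premise gear quietly accumulates.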

Unfortunately, the term ‘cloud’ has been abused so much at this point that it’s been rendered almost meaningless, but don’t let the fact that the category has been bogged down by marketer speak drive your IT strategy. The cloud has its place in most IT infrastructure decisions, and you have to pick and choose where it fits based on real information.

I know that as I’ve written about ‘the cloud’ over the last year, I’ve gotten a lot of pushback from IT pros. You have a right to be cynical and to question it, but you also can’t let your own bias get in the way of making sound decisions for your organization. Just because it’s become the buzzword du jour doesn’t mean there aren’t real solutions within the category that can help your organization.

When Skype went down for many people just before Christmas, it was another lesson in how much we depend on cloud services and how lost we feel when they fail. But it was also a lesson in how we react to these outages.

Much has been written about why Skype went down including this great explanation of how Skype works on the Disruptive Telephony blog and this search for answers from Steven J. Vaughan-Nichols on Ziff-Davis. No need to rehash all of that. Instead I want to look at our reactions to cloud outages and why we seem to panic each time they happen.

After the Skype outage started, I got an email from a friend asking whether the outage had just cost Skype billions of dollars (presumably in money it would have earned from a sale or IPO). You see, her teen son, who uses Skype every night to talk to his girlfriend, panicked when he realized he couldn’t talk to her the night of the outage. She showed him Google Voice, and the rest might have been history, except for a couple of things.

First of all, he didn’t like Google Voice as much as Skype, and then Skype was back up the next day. But the reaction was what I found so interesting. We’ve all experienced cable and Internet outages. Most everyone has lost electricity and phone service too, yet when these utilities go down, we don’t hear suggestions that the cable or electric company will lose value as a result. Why? Because we expect that these services will fail from time to time in spite of the company’s best efforts.

And that’s the attitude we have to take with cloud services too. The big difference with cloud services, of course, is that there is plenty of competition, so if those outages became a pattern, chances are we would switch providers. But we should all be able to live with an occasional outage that’s simply beyond the control of the company.
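It helps to put “occasional outage” in numbers. This quick Python sketch converts an availability percentage into expected downtime per year, a useful sanity check when reading any provider’s uptime promise (the percentages shown are just common reference points, not any vendor’s actual guarantee):

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def downtime_hours_per_year(availability_pct: float) -> float:
    """Expected annual downtime implied by an availability figure."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% uptime -> {downtime_hours_per_year(pct):.2f} hours down/year")
```

Even “three nines” (99.9 percent) allows almost nine hours of downtime a year, so an isolated outage like Skype’s is well within what we already tolerate from the electric and cable companies.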

I can guarantee your PC has failed you more than once, or if you work in a company, that a crucial enterprise program has stopped working from time-to-time. When you’re working with technology, it’s just part of the package. Sooner or later something is going to fail simply because all technology fails on occasion.

What we need to do is mature to the point that we understand and accept this as a fact of life and not panic every time it happens. When Gmail went down for two hours in September 2009, you would have thought the world had gone to pieces. When Skype went down in December, we showed we hadn’t learned much.

Skype did a good job of communicating in multiple channels–on Twitter and on the company blog–being open about the problem and what it was doing to try to solve it. Within the day, what turned out to be a huge problem had been solved, and Skype was back in business.

This past Sunday, my wife had a video chat on Skype with our nieces and nephew in Australia (well, it was Monday there), using the Skype for iPhone video chat released just last week. In the end, this was a little blip in the scheme of things, a moment in time when a valuable service stopped working, but it wasn’t the end of the world, and it certainly wasn’t the end of Skype. How about we all keep that in mind the next time something like this happens, because sooner or later it will.

Google made a lot of noise earlier this month when it announced the Chrome OS pilot program. Given the fanfare, you would think the world had never seen a cloud-based operating system before. In fact, the concept has been around for years, but this was Google so it made a big splash. It also had an important difference from the other offerings in that instead of being just a portal to cloud apps, it was attempting to be the underlying software controlling the computer as well.

I’ve yet to see the offering, but Chrome OS appears to be a browser–Google Chrome, of course–and a front end to Google services, all sitting on top of a Linux desktop. According to an initial review by Walt Mossberg in the Wall Street Journal, the underlying bits that are supposed to do the actual operating system work don’t function all that well yet. It’s admittedly beta software, but we already knew that Google gets how to make a browser and online applications. We didn’t need an experimental computer to figure out that part. What Google is attempting to do is adapt desktop Linux much in the same way it created Android to run mobile phones. So far, it needs some work.

From an IT perspective, however, Chrome OS has the potential to be the Holy Grail of laptops (if you don’t mind using all Google all the time). That’s because users can’t add any external programs. You get to use whatever programs are in the portal and that’s it. As Steven J. Vaughan-Nichols pointed out, it also protects even the stupidest user from malware.

In addition, it solves the problems of lost or stolen laptops because the data itself is not tied to the physical device, but lives in the cloud. That of course is the beauty of the cloud whether you’re using a laptop running Chrome OS, Windows or OSX, but is it enough for many users or even IT?

For some users it may be a great idea, that is, if Google can figure out how to create a desktop operating system that actually manages the underlying functionality. If you have a very limited set of tasks to do each day, a machine that manages those tasks for you could be a good idea. But I have the feeling that most users, including executives, knowledge workers and just about anyone who uses a computer for more than the most basic tasks, will balk at this idea.

You don’t need a locked-down machine to give you the benefits of the cloud. Regular backups to the cloud, for instance, will give you the same advantages as using a machine that operates exclusively there. In fact, I’m not sure what advantages Google’s operating system provides, except that it gives Google the opportunity to go after Apple and Microsoft in a big way and tie those who buy a Chrome OS machine to Google services.

In the end, I don’t necessarily see Chrome OS being as successful as Android has been in the phone market. I’m sure it will find its niche, but not necessarily a mass market because whether you’re an IT pro or a clueless end user, you may not be comfortable operating in Google’s world 24/7.

I don’t tend to go negative when it comes to the cloud, but the story earlier this month that Amazon Web Services cut off WikiLeaks for “violating the terms of service,” gave me pause. Instead of running scared, however, it could be a good ‘teachable moment’ about understanding your Terms of Service.

In a post on the Wall Street Journal’s Tech Europe blog, Ben Rooney reported that Dr. Joseph Reger, who is CTO at Fujitsu Technology Solutions, said that Amazon’s response to WikiLeaks showed a need for industry standards around the cloud. That’s because in his view, if it could happen to WikiLeaks, it could happen to you, and he has a point.

I’m sure Amazon feels it was in the right because it says WikiLeaks was using content that didn’t belong to it. Well, yes, technically it was, but it wasn’t a pirate site by any means. Would Amazon have shut down the New York Times web site if it had been using Amazon Web Services? I think not. So while Amazon’s lawyers are probably off the hook, as Dr. Reger pointed out, what the company gained in legal points, it lost in public perception.

That’s because Amazon played into the biggest fear that cloud critics have: the general sense of unease when your content sits on somebody else’s server and is in another company’s control. If Amazon decides you aren’t playing by the rules, you could be in the penalty box and your business severely compromised.

What’s most disconcerting about this action was the arbitrariness of it. It wasn’t a law enforcement official or a court ordering the content be taken down (although there were reports of State Department pressure). No, it was the lawyers at Amazon making the decision, and that should be frightening to everyone.

What this shows is the importance of understanding every word in your Terms of Service (ToS). In the brave new world of IT responsibility, negotiating the ToS with cloud providers like Amazon is going to be Job One. Don’t rubber-stamp it. Make sure you and your organization’s lawyers understand every word.

If you’re not happy, negotiate. And one point you should always place in the ToS is that under no circumstances will the provider shut you down without written notice and sound legal reasoning (meaning a court or legal authority has ordered it).

There really are a lot of positives about going to the cloud. The idea of only paying for what you use is very attractive, but there have to be clear rules about uptime, governance and who can take your service down (and, as Reger said, these should be codified into an industry standard). In my view, if you haven’t received a court order, you’d better keep me running. You don’t ever shut me down because you feel uneasy about my content (as with WikiLeaks).

WikiLeaks has been an object lesson on so many levels, and the shutdown at Amazon just provides one more–this time for IT professionals. The cloud has positives and negatives like any other approach, but you can reduce those negatives with smart planning and a clear ToS. If you haven’t learned this by now, you never will.

About This Blog

As business users increasingly find themselves connecting to the internet away from the office, the cloud and mobile devices grow ever more important. This blog will look at ways these issues are affecting IT and the ways companies link data so it's updated wherever you are.