Kudos to Microsoft for the far-sighted investment. As organisations begin to realise the risks associated with their cloud supply chain, opaque cloud suppliers like AWS and SoftLayer will be abandoned for more responsible, transparent, lower-risk suppliers like Microsoft.

Welcome to episode twenty-two of the Technology for Good hangout. In this week’s episode we had David Terrar, CEO of D2C and co-founder of AgileElephant, as our guest on the show. As well as being a fellow Enterprise Irregular, David is an old friend, so we had a lot of fun discussing this week’s crop of stories. Last week Google held its I/O developer conference, so there were plenty of Google stories breaking, but we also found time to fit in topics such as renewables, communications, and health.

Welcome to episode eleven of the Technology for Good hangout. In this week’s show our special guest was unable to make it due to looming deadlines, so I did the show solo. Given the week that was in it, with Microsoft’s Build conference taking place, there were plenty of stories stemming from Microsoft’s various announcements, but there was also a ton of other news, as always.

Cloud computing is often incorrectly touted as a green, more environmentally friendly computing option. This confusion arises because people forget that while cloud computing may be more energy efficient (may be), its environmental friendliness is determined by how much carbon is produced in generating that energy. If a data centre is primarily powered by coal, it doesn’t matter how energy efficient it is, it will never be green.

One such cloud provider is SAP. Like most other cloud vendors, they’re constantly increasing their portfolio of cloud products. This has presented them with some challenges when they have to consider their carbon footprint. In its recently released 2013 Annual Report, SAP admits:

Energy usage in our data centers contributed to 6% of our total emissions in 2013, compared with 5% in 2012

This is going in the wrong direction for a company whose stated aim is to reduce the greenhouse gas emissions from its operations to year-2000 levels by 2020.

SAP also announced that it will power all its data centers and facilities globally with 100 percent renewable electricity starting in 2014.

This is good for SAP, obviously, as they will be reducing their environmental footprint, and also good for customers of SAP’s cloud solutions, who will get the benefit of SAP’s green investments. How are SAP achieving this goal of 100 percent renewable energy for its data centers and facilities? Through a combination of generating its own electricity using solar panels in Germany and Palo Alto (<1%), purchasing renewable energy and high-quality renewable energy certificates, and a €3m investment in the Livelihoods Fund.

So, how does SAP’s green credentials stack up against some of its rivals in the cloud computing space?

Scope 3 GHG emissions are typically defined as indirect emissions from operations outside the direct control of the company, such as employee commutes, business travel, and supply chain operations. Oracle does not report on Scope 3 emissions.

And then there’s Amazon. Amazon doesn’t release any kind of information about the carbon footprint of its facilities. None.

So kudos to SAP for taking this step to green its cloud computing fleet. Looking at the competition, I’d have to say SAP comes in around middle of the road in terms of its green cloud credentials. If it wants to improve its ranking, it may be time to revisit that 2020 goal.

If you checked out the New York Times Snow Fall site (the story of the Avalanche at Tunnel Creek), then Microsoft’s new 88 Acres site will look familiar. If you haven’t seen the Snow Fall site then go check it out, it is a beautiful and sensitive telling of a tragic story. You won’t regret the few minutes you spend viewing it.

Microsoft’s 88 Acres is an obvious homage to that site, except that it tells a good news story, thankfully, and tells it well. It is the story of how Microsoft is turning its 125-building Redmond HQ into a smart corporate campus.

Microsoft’s campus had been built over several decades with little thought given to integrating the building management systems there. When Darrell Smith, Microsoft’s director of facilities and energy, joined the company in 2008, he priced a ‘rip and replace’ option to get the disparate systems talking to each other, but when it came in at over $60m, he decided they needed to brew their own. And that’s just what they did.

Using Microsoft’s own software, they built a system capable of taking in data from the more than 30,000 sensors throughout the campus and detecting and reporting on anomalies. They first piloted the solution on 13 buildings on the campus, and as they explain on the 88 Acres site:

In one building garage, exhaust fans had been mistakenly left on for a year (to the tune of $66,000 of wasted energy). Within moments of coming online, the smart buildings solution sniffed out this fault and the problem was corrected.
In another building, the software informed engineers about a pressurization issue in a chilled water system. The problem took less than five minutes to fix, resulting in $12,000 of savings each year.
Those fixes were just the beginning.

The system balances factors like the cost of a fix, the money that will be saved by the fix, and the disruption a fix will have on employees. It then prioritises the issues it finds based on these factors.
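Microsoft hasn’t published its scoring formula, but the trade-off described here can be sketched as a simple payback-style score. The `Fault` fields and the weighting below are my own assumptions for illustration, not Microsoft’s:

```python
from dataclasses import dataclass

@dataclass
class Fault:
    name: str
    fix_cost: float        # one-off cost of the repair ($)
    annual_savings: float  # energy cost avoided per year ($)
    disruption: float      # 0 (none) .. 1 (major occupant impact)

def priority(f: Fault) -> float:
    # Savings per dollar spent, discounted by how disruptive
    # the fix would be to building occupants.
    return (f.annual_savings / max(f.fix_cost, 1.0)) * (1.0 - f.disruption)

# Hypothetical costs attached to the two faults mentioned above.
faults = [
    Fault("garage exhaust fans left on", fix_cost=500, annual_savings=66000, disruption=0.0),
    Fault("chilled water pressurisation", fix_cost=200, annual_savings=12000, disruption=0.1),
]
for f in sorted(faults, key=priority, reverse=True):
    print(f.name, round(priority(f), 1))
```

Any real system would weigh more factors than this, but the principle is the same: rank the work-order queue by expected return, not by discovery order.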

Microsoft facilities engineer Jonathan Grove sums up how the new system changes his job: “I used to spend 70 percent of my time gathering and compiling data and only about 30 percent of my time doing engineering,” Grove says. “Our smart buildings work serves up data for me in easily consumable formats, so now I get to spend 95 percent of my time doing engineering, which is great.”

The facilities team are now dealing with enormous quantities of data. According to Microsoft, the 125 buildings contain 2,000,000 data points outputting around 500,000,000 data transactions every 24 hours. The charts, graphics and reports the system produces lead to about 32,300 work orders being issued per quarter. And 48% of the faults found are corrected within 60 seconds. Microsoft forecasts energy savings of 6-10% per year, with an implementation payback of 18 months.

Because Microsoft’s smart building tool was built using off-the-shelf Microsoft technologies, it is now being productised and will be offered for sale. It joins a slew of other smarter building solutions currently on the market from the likes of IBM, Echelon, and Cisco, but given that this one is built with basic Microsoft technologies, it will be interesting to see where it comes in on pricing.

Price will certainly be one of the big deciding factors in any purchasing decision; any building management tool will need to repay its costs within about 18 months to merit consideration. Functionality, too, will be one of the primary purchase filters, and what is not clear at all from the Microsoft report is whether the solution can handle buildings across multiple sites or geographies. If I hear back either way from Microsoft on this, I will update this post.

This is a market that is really starting to take off. Navigant Research (formerly Pike Research) issued a report last year estimating that the smart building managed services market alone will grow from $291m in 2012 to $1.1bn by 2020, while IMS Research estimated the Americas market for integrated and intelligent building systems would be worth more than $24 billion in 2012.

One thing is for sure, given that buildings consume around 40% of our energy, any new entrant into the smarter buildings arena is to be welcomed.

This is big news. Microsoft has 600 facilities across 110 countries worldwide. For the first time, the full energy and environmental footprints of all these sites will now be managed from within a single cloud-delivered resource, the CarbonSystems ESP system. The levels of transparency this will give Microsoft will be immense. Perhaps now, unlike many of its competitors, Microsoft will be able to join the EU’s ICT Footprint initiative.

This move should also enable Microsoft to report on the energy and emissions associated with its own cloud infrastructure – something, like all other cloud providers, Microsoft has failed to do to-date.

This move is a big deal for CarbonSystems too. CarbonSystems are an Australian company and have done quite well there, but have more recently been eyeing the EU and US markets. Being selected by Microsoft for a global rollout has suddenly catapulted them up the credibility charts. Had you asked me previously which third-party platform Microsoft might have chosen, I’d probably have mentioned SAP, Hara, CA, or Enablon.

Now with this win, CarbonSystems too has a seat at the big boys’ table.

Power Usage Effectiveness (PUE) is a widely used metric which is supposed to measure how efficient data centers are. It is the unit of data center efficiency regularly quoted by all the industry players (Facebook, Google, Microsoft, etc.).
However, despite its widespread use, it is a very poor measure of data center energy efficiency or of a data center’s Green credentials.

Consider the example above (which I first saw espoused here): in the first row, a typical data center has a total draw of 2MW of electricity for the entire facility, of which 1MW goes to the IT equipment (servers, storage and networking equipment). This results in a PUE of 2.0.

If the data center owner then goes on an efficiency drive and reduces the IT equipment energy draw by 0.25MW (by turning off old servers, virtualising, etc.), then the total draw drops to 1.75MW (ignoring any reduced requirement for cooling from the lower IT draw). This causes the PUE to increase to 2.33.

Given that lower PUEs are considered better (1.0 being the theoretical minimum), this is a ludicrous situation.
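The arithmetic behind the paradox is easy to verify. A quick sketch:

```python
def pue(total_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility draw / IT equipment draw."""
    return total_kw / it_kw

# Before the efficiency drive: 2MW total, of which 1MW goes to IT equipment.
print(pue(2000, 1000))  # 2.0

# After turning off old servers: IT draw falls by 0.25MW, so the total
# falls to 1.75MW (ignoring any reduced cooling load). PUE gets *worse*.
print(pue(1750, 750))   # ≈2.33
```

Cutting IT waste shrinks the denominator faster than the numerator, so the metric punishes exactly the behaviour it should reward.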

Then consider that not only is PUE a poor indicator of a data center’s energy efficiency, it is also a terrible indicator of how Green a data center is, as Romonet’s Liam Newcombe points out.

Consider the example above – in the first row, a typical data center with a PUE of 1.5 uses an average energy supplier with a carbon intensity of 0.5kg CO2/kWh resulting in carbon emissions of 0.75kg CO2/kWh for the IT equipment.

Now look at the situation with a data center with a low PUE of 1.2 but sourcing energy from a supplier who burns a lot of coal, for example. Their carbon intensity of supply is 0.8kg CO2/kWh resulting in an IT equipment carbon intensity of 0.96kg CO2/kWh.

On the other hand, look at the situation with a data center with a poor PUE of 3.0. If their energy supplier uses a lot of renewables (and/or nuclear) in their generation mix, they could easily have a carbon intensity of 0.2kg CO2/kWh or lower. At 0.2, the IT equipment’s carbon emissions are 0.6kg CO2/kWh.

So the data center with the lowest PUE by a long shot has the highest carbon footprint, while the data center with the ridiculously high PUE of 3.0 has by far the lowest carbon footprint. And that takes no account of the water footprint of the data center (nuclear power has an enormous water footprint) or of its energy supplier.
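Newcombe’s point reduces to a single multiplication: every kWh delivered to the IT equipment drags PUE kWh through the meter, so the grid’s carbon intensity is scaled up by the PUE:

```python
def it_carbon_intensity(pue: float, grid_kg_per_kwh: float) -> float:
    # kg of CO2 emitted per kWh delivered to the IT equipment.
    return pue * grid_kg_per_kwh

print(round(it_carbon_intensity(1.5, 0.5), 2))  # 0.75 -- typical data center
print(round(it_carbon_intensity(1.2, 0.8), 2))  # 0.96 -- "efficient", but coal-heavy grid
print(round(it_carbon_intensity(3.0, 0.2), 2))  # 0.6  -- "inefficient", but low-carbon grid
```

Same three rows as above: the best PUE produces the worst emissions, because the supplier’s generation mix dominates the outcome.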

The most significant change to the System Center Configuration Manager in R3 is the new power management set of strategies.

By way of background, Brad talked about how an increasing number of RFPs received by Microsoft were requesting information on what Microsoft was doing to reduce its footprint. According to Brad, reducing your energy footprint is now an imperative for doing business, not just a way of saving the company money.

System Center Configuration Manager 2007 R3 config screen

Microsoft’s System Center Configuration Manager allows systems administrators to centrally control all kinds of policies on client servers and PCs on a network. Everything from what appears in the Start menu right through to security management policies can be deployed using this software (aside – as a sysadmin of a small company back in the early 2000s, I used the config manager to set people’s PC wallpaper to an HTML version of the company phone book!).

The ability to control the energy policies of client PCs is hugely important because that’s where most of an organisation’s CPUs are. The Ford Motor Company, for example, recently announced that by rolling out 1E’s NightWatchman PC energy management application it was going to save

$1.2 million and reduce its carbon footprint by 16,000-25,000 metric tons annually

1E are a Microsoft partner, and their NightWatchman product goes significantly further with PC power management, according to Microsoft’s Rob Reynolds, Director of Product Planning for System Center, who briefed me on the new System Center Configuration Manager. Configuration Manager will only put PCs into sleep mode, for example, whereas NightWatchman can shut them down completely, and NightWatchman has significant power management controls for XP clients which Configuration Manager is missing.

The new software gives you:

- The ability to see and set how and where power is being used
- The ability to see what your user activity looks like
- A set of recommendations on policy to show you how to reduce your power consumption
- Tracking and reporting on how much carbon you have prevented from being released as a result of your power management capabilities

On the server front, Rob outlined a scenario where based on reduced demand (overnight, say), virtual machines can be re-provisioned onto fewer hosts and then some of the servers could be put into a low power state. Then as demand picks up once more (following morning) the servers in low power mode can be woken back up and the virtual machines moved back onto them.
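Rob didn’t describe which packing algorithm the software uses; a hypothetical sketch using first-fit-decreasing bin packing shows the idea (the VM loads and host capacity below are invented for illustration):

```python
def consolidate(vm_loads, host_capacity):
    """First-fit decreasing bin packing: place each VM (largest first)
    on the first host with room, opening a new host only when needed.
    Returns a list of hosts, each a list of VM loads."""
    hosts = []
    for load in sorted(vm_loads, reverse=True):
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])
    return hosts

# Overnight, ten VMs' reduced loads fit on far fewer hosts; the
# now-empty hosts can be put into a low-power state until morning.
overnight = consolidate([10, 20, 15, 5, 30, 25, 10, 5, 20, 10], host_capacity=100)
print(len(overnight), "hosts stay up; the rest can enter a low-power state")
```

Waking the spare hosts as demand returns is the same process in reverse: power them back on, then live-migrate the VMs off the crowded hosts.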

While many products such as NightWatchman already exist with this functionality, having it built into Configuration Manager will now put this within easy reach of all Microsoft customers and that can only be a good thing.

Another company in the list worthy of note is BT, whose report, despite the lack of interactivity, is the only other report to hit the GRI A+ rating.

HP’s site has gone heavy on design to the detriment of usability, which is unfortunate because some of the content is really good.

After that, almost all of the companies who have a 2009 report published have done a really good job. The exception to this is Microsoft whose 2009 report, while an improvement on previous reports, still has a long way to go to approach a professional CSR Report standard.

Of the companies who have yet to publish their 2009 report, Oracle and Adobe’s 2008 reports are lacklustre attempts, at best. Neither report to GRI standards and both are long on pretty pictures and short on relevant data.

Having said that, at least Oracle and Adobe are producing Sustainability reports.

The three laggards in this list are Google, Amazon and Apple – none of whom are producing sustainability reports at the minute.

In their defence, Google has its Going Green at Google website and Apple has its Apple and the Environment site, both of which go into considerable detail on each company’s initiatives. In Apple’s case, it does go deep into a lot of the data you would normally see in a Sustainability report. Why it refuses to produce a formal report is beyond me.

In contrast, Amazon’s attempt at an Environmental site/page is an embarrassment. If this is the best they can do, honestly, they’d be better off doing nothing.

One issue I noted was that HP, Cisco and Apple [PDF] all report on sourcing 100% renewable power in Ireland. This is not possible for the reasons I outlined in this post.

What other companies should I add to this list? Please feel free to suggest any in the comments and I will update the list.

UPDATES:
Since publishing this, Nokia have brought out their excellent 2009 report and it is now included above.
Also, based on suggestions received on Facebook, I have added details about three other companies (NEC, Fujitsu and Indra Sistemas). It was also suggested there that I go over various telco companies’ CSR reports. I’ll leave that to a separate post.

was “quite sceptical” about this issue. “None of the cloud providers such as Amazon, Microsoft or IBM are publishing metrics at all. Intuitively you have to think that because you’re outsourcing that to someone of that scale that they’re being more efficient, but we’ve no way of knowing. Frankly, that’s worrisome. I don’t know why they’re not publishing it and I wish they would,”

This is no sudden realisation on my part. In fact, I have been concerned about Cloud Computing’s Green credentials for some time now as you can see from a series of Tweets (here, here and here, for instance) I posted on this issue in early to mid 2009.

It is vital that cloud providers start publishing their energy metrics for a number of reasons. For one, it is a competitive differentiator. But perhaps more importantly, in the absence of any provider numbers, one has to start wondering if cloud computing is in fact Green at all.

I’m not sure why cloud providers are not publishing their energy metrics but if I had to guess I would say it is related to concerns around competitive intelligence. However this is not a sustainable position (if you’ll pardon the pun).

As the regulatory landscape around emissions reporting alters, and as organisations’ RFPs increasingly demand more details on emissions, cloud providers who refuse to provide energy-related numbers will find themselves increasingly marginalised.

So is cloud computing Green?

I put that question to Simon Wardley, cloud strategist for Canonical, in this video I recorded with him last year, and he said no, cloud computing is very definitely not Green.

To be honest, until cloud providers start becoming more transparent around their utilisation and consumption numbers there is really no way of knowing whether cloud computing is in any way Green at all.