
Microsoft's data center cooling a breath of fresh air

We all know that heat is the enemy of computers. In fact, if we think of computers in terms of an ecosystem, heat is probably their only naturally occurring predator. Ironically, they generate it themselves, though that's simple physics: bring electricity into a system to do work for you, and it gets transformed into something else without changing the total amount of energy involved, in this case mostly heat.

Heat is a particularly insidious enemy of processors, more akin to a poison than a predator: first it slows its victim down, then it causes lots of subtle damage, and finally it kills. Data centers, which pack thousands of computers into a relatively small space, are uniquely susceptible to the risks of heat. They deal with it in various ways, and have in the past been criticized by some as inefficient beasts of burden for it.

As agencies consolidate data centers and virtualize servers, finding new ways to keep them cool could add one more way to reduce costs and increase efficiency.

Many companies offer examples, streamlining their data centers with new techniques and processes that IDC Government Insights says feds will need to emulate in order to increase their own effective use of computers, data centers and the cloud.

Recently, Google pulled back the curtain on how it manages the heat at some of the largest data centers in the world. The company's techniques involve stripping almost every unnecessary component and scrap of metal away from the processors inside its data centers. The computers in the racks at Google centers are little more than pallets holding motherboards.

Even the internal walls of the data center are constructed of fabric — enough to direct air flow in the proper direction, but not enough to add to the complex problems associated with heat management. They are also cheap, and easy to reconfigure on the fly. Then Google runs a lot of cool water into the facility, sometimes having the liquid-carrying pipes within inches of the processors themselves.

That's a pretty efficient model of doing things, but Microsoft is now going one step further, literally setting its servers outside in roofless data centers. According to Data Center Knowledge, the idea behind Microsoft's new billion-dollar roofless data center facility in Boydton, Va., came from Christian Belady, general manager of Microsoft Data Center Services. He thought that computers should be able to brave the outdoor elements, and set up a server rack in a pup tent back in 2008. It ran for eight months with 100 percent uptime, demonstrating that outdoor computer cooling and housing was at least feasible.

Now, it’s more than just dropping a computer in a field and hoping that it doesn’t get rained or snowed on, or marked by passing animals. Because if you leave it in the elements, that will happen. And it will break.

But Microsoft has been designing smaller and smaller containers to hold its servers for years, Data Center Knowledge reports. Called IT-PACs, for IT pre-assembled components, the shipping-crate-like boxes can each hold hundreds of servers. Cool air from outside is drawn into each unit through vents on the side, where it passes through a wet membrane; evaporation cools the air before it circulates over the servers. This method reportedly uses just 10 percent of the water needed to cool most data centers of the same size.
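The physical limit on this kind of evaporative cooling is the wet-bulb temperature: air can only be chilled down to roughly that point, which is why dry climates get far more cooling out of the technique than humid ones. As a rough illustration (not part of Microsoft's published design), here is a sketch using Stull's 2011 empirical approximation of wet-bulb temperature from ordinary temperature and relative humidity:

```python
import math

def wet_bulb_c(temp_c: float, rh_percent: float) -> float:
    """Approximate wet-bulb temperature in deg C from dry-bulb temperature
    and relative humidity, using Stull's (2011) empirical fit.
    Valid for roughly 5-99% RH and -20 to 50 deg C."""
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# A 35 C (95 F) day: dry desert air vs. a humid Southern afternoon.
print(round(wet_bulb_c(35, 20), 1))  # ~19.3 C: plenty of evaporative headroom
print(round(wet_bulb_c(35, 70), 1))  # ~30.3 C: very little cooling available
```

The same 95-degree day leaves an evaporative system more than 15 degrees C of cooling headroom in a desert, but under 5 degrees in 70 percent humidity, which is exactly the concern readers raise below about Virginia summers.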

Future Microsoft data centers may be little more than concrete slabs on the ground, with the IT-PACs sitting on top. And although there is some concern that Virginia might prove too hot for this method to work — the company also has experimented with outdoor cooling on a more limited basis in Washington State, Chicago and Ireland — Microsoft seems confident that it will do just fine in The Old Dominion.

I guess we’ll see what happens when the new sparse and efficient data center meets its first brutal southern summer. But in any case, this is a great example of one possible path for agencies to follow as they strive to increase their own efficiency with data centers.

Reader Comments

Tue, Dec 17, 2013
Christine

What's the update on this very interesting story? What was the outcome from Microsoft's perspective? If the Boydton servers made it 8 months with 100% uptime under a pup tent, will open-air data centers be a viable technology approach for climates with heat/humidity in summer and cold/snow in winter? I'd love to see a follow-up story.

Mon, Jun 10, 2013
Allen

You are right, Chris, and Virginia would be considered a hot and humid area, especially where the facility is placed in the southern part of the state. But it also will have to face ice and snow, which can be just as bad without a roof.

Wed, Apr 24, 2013
Chris

The water cooling system would work fine in the arid desert, but in hot, humid areas it is much less effective. The jury is still out.

Fri, Mar 8, 2013
Louis Frost

This is an interesting concept. However, I will wait to see how it fares over a longer period than 8 months.
It is not whether it can survive 8 or 10 months, but how it survives over 18-24 months. Environmental failures start becoming significant at 12 months, and increase significantly after 18 months. This is okay for internet companies like Google, as their processing requirements change so rapidly that they outgrow a machine in very little time. But what about the other 80-plus percent of centers out there?
I look forward to seeing what portion of the lessons learned here are applicable across the board.
Louis Frost
President,
C/TEC Critical Facilities Group

Tue, Feb 12, 2013
Mel

Prediction: First 105 degree day we have, those computers melt inside their shipping containers.
