
47 Comments

I wonder why nobody has tried geothermal liquid cooling. You could do it two ways: either with a geothermal heat pump setup, or cut out the middleman and just use the earth like you would a radiator in a liquid-cooling loop. The only problem would be how many wells you would have to drill to cool up to 100 MW (I'm thinking 20+ at a depth of at least 50 ft).

It's kind of easier to just use a nearby river than to dig for and pump up groundwater. That's what power stations and big chemical factories do. For everybody else, air cooling is just easier and less expensive.

You wouldn't be drilling for water. You drill a well so you can put pipe in it, fill it back up, and then pump water through the pipes, using the earth's constant temperature (~20 °C) to cool your liquid, which is warmer (>~30 °C).

I experimented with this (mathematically) and found that heat soak is a serious, variable concern. If new moisture is coming in from the surface, this is not as much of an issue, but if it isn't, you could have a problem in short order. Then there are the corrosion and maintenance issues...

The net result is that it is cheaper and easier to just install a few ten-thousand-gallon coolant holding tanks, keep them cool (but above ambient), and use them to cool the air in the server room(s). These tanks can be put inside a hill or in the ground for extra insulation, and a surface radiator system could allow using cold outside air to save energy.

You obviously don't have a clue about drilling costs. For a 2,000 sq ft home, a geothermal driller needs between 200-300 lineal feet of well bore to cool the house. In unconsolidated material, drilling costs range from $15-$30 per foot, depending on the rig. For drilling in rock, up the cost to $45/foot. For something that uses 80,000x more power than a typical home, what do you think the drilling costs would be? Go back to heating up Hot Pockets.
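A quick back-of-envelope scale-up, using only the figures quoted in that comment (the midpoints are my own assumptions):

```python
# Scaling the drilling figures quoted above: a 2,000 sq ft home needs
# 200-300 lineal feet of bore at $15-$45/foot, and the data center
# draws ~80,000x the power of a typical home. Midpoints assumed by me.
feet_per_home = 250      # midpoint of 200-300 lineal feet
cost_per_foot = 30       # midpoint of $15-$45 per foot
power_ratio = 80_000     # quoted power ratio vs. a typical home

total_feet = feet_per_home * power_ratio
total_cost = total_feet * cost_per_foot
print(f"{total_feet:,} ft of bore, roughly ${total_cost:,} in drilling alone")
```

Even with generous assumptions, that works out to over half a billion dollars before a single pipe is laid.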

Geothermal heat pumps are only moderately more efficient than standard air conditioning and require an enormous amount of area. Twenty holes at a depth of 50 ft would handle the cooling requirements for a large residential home, but wouldn't even approach the requirements for a data center. One related possibility is to drill to a nearby aquifer and draw cool water, run it through a heat exchanger, then exhaust warm water into the same aquifer. Unfortunately, water overuse has drained aquifers such that even the pumping costs would be substantial, and the aquifers will eventually be drained to the point that vacuum-based pumps can no longer draw water.
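To put the proposed "20+ wells at 50 ft" into perspective, here is a rough sanity check; the ~50 W per metre of bore figure is a typical vertical ground-loop value I'm assuming, not one from this thread:

```python
# Rough well count for a 100 MW heat load, assuming a typical
# vertical ground loop rejects on the order of 50 W per metre of bore.
watts_per_metre = 50   # assumed typical borehole heat rejection rate
load_watts = 100e6     # 100 MW load from the original comment
well_depth_m = 15      # ~50 ft per well, as proposed

bore_metres = load_watts / watts_per_metre
wells_needed = bore_metres / well_depth_m
print(f"{bore_metres:,.0f} m of bore, i.e. ~{wells_needed:,.0f} wells")
```

That's on the order of 130,000 wells, not 20.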

They are a lot more efficient at heating, but only mildly more efficient at cooling. They are also really storing heat in the ground in the summer and taking it back in the winter, so if you only ever store heat you can actually have a problem long-term. You're essentially using the ground as a long-term heat storage device: the ground is between 50-60 °F depending on your area of the country, but use of the geothermal loop changes that temperature. An air source makes much more sense, since you share the air with everyone else and the heat essentially just blows away.

It's been done already. I know I've seen it in an article on new data centers in one industry publication or another. A museum near me recently drilled dozens of wells under their parking lot for geothermal cooling of the building. Being large with lots of glass area, it got unbearably hot during the summer months. Now, while it isn't as cool as you might set your home air conditioning, it's quite comfortable even on the hottest days, and the only energy use is for the water pumps and fans. Plus, it's better for the exhibits, reducing the yearly variation in temperature and humidity. Definitely a feasible approach for a data center.

I was actually talking about this today; the big cost for our data centers is air conditioning. What if we had a building up north (Arctic) where the ground is always frozen, even in summer? Geothermal cooling for free, by pumping water through the ground as your "radiator".

Not sure about the environmental impact this would have, but the emptiness that is the Arctic might like a few data centers!

Unfortunately, the cold areas are also devoid of people and therefore internet connections. You'd have to figure in the cost of running fiber to your remote location, as well as how the distance might affect latency. If you go into permafrost areas, there are additional complications, as constructing on permafrost is a challenge. A data center high in the mountains but close to population centers would seem a good compromise.

I proposed this at work, but management stopped listening somewhere between me saying we'd need to put a trench through the warehouse floor to outside the building, and that I'd need a large, deep hole dug right next to the building, where I would bury several hundred feet of copper pipe.

I also considered using the river that's 20 ft from the office, but I'm not sure the city would like me pumping warm water into their river.

You seem to be reporting on the junction temperature, which is reported by most measurement programs, rather than the cast temperature that is impossible to measure directly without interfering with the results. How have you accounted for this in your testing?

Do you mean case temperature? We did measure the outlet temperature, but it was significantly lower than the junction temperature. For the Xeon 2697 v2, it was 39-40 °C at 35 °C inlet and 45 °C at 40 °C inlet.

Google's use of raw seawater to cool their data center in Hamina, Finland is pretty cool, IMO. Given that the specific heat capacity of water is much higher than air's, it is more efficient for cooling, especially in our climate where seawater is always relatively cold.

I think you oversimplify if you judge the efficiency of the cooling method by the heat capacity of the medium alone. The medium is not a heat battery that only absorbs the heat; it also has to be moved in order to transport the energy. And moving air is much easier and much more efficient than moving water.

So I think in the case of Finland the deciding factor is that they get air temperatures of up to 30 °C in some summers, but the water temperature in the bottom regions of the Gulf of Finland stays below 4 °C throughout the year. If you instead consider a data center near the river Nile, which is usually just 5 °C below air temperature and frequently warmer than the air at night, your efficiency equation would look entirely different.

Naturally, building the center in Finland instead of Egypt in the first place is a pretty good decision as far as cooling efficiency is concerned.
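The volumetric difference this sub-thread is arguing over is easy to quantify. A back-of-envelope sketch, using approximate textbook densities and specific heats (my values, not from the thread):

```python
# Heat carried per cubic metre of coolant per kelvin of temperature
# rise, q = rho * c_p, using approximate textbook values near 20 °C.
rho_air, cp_air = 1.2, 1005        # kg/m^3, J/(kg*K)
rho_water, cp_water = 998.0, 4186  # kg/m^3, J/(kg*K)

q_air = rho_air * cp_air           # ~1.2 kJ per m^3 per K
q_water = rho_water * cp_water     # ~4.2 MJ per m^3 per K
ratio = q_water / q_air
print(f"water carries ~{ratio:,.0f}x more heat per unit volume")
```

Roughly 3,500x per unit volume, which is why both sides have a point: you move far less water for the same heat load, but as noted above it is far harder to push through the loop.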

Isn't moving water significantly more efficient than moving air, because a significant amount of the energy spent trying to move air goes into compressing it rather than moving it, whereas water is largely incompressible?

For the initial acceleration this might be an effect, though energy used for compression isn't necessarily lost, as the pressure difference will decay via motion of the air again (though maybe not in the preferred direction). But if you look at the entire equation for a cooling system, the hard part is not getting the medium accelerated, but keeping it moving against the resistance of the coolers, tubes and radiators. Water has much stronger interactions with any reasonably usable material (metal, mostly) than air does, and you usually run water through smaller and longer tubes than air, which can quickly be moved from the electronics case to a large air vent. Also, the viscosity of water itself is significantly higher than that of air, particularly if we are talking about cool water not too far above freezing, i.e. 5 °C to 10 °C.

Sir, I can assure you the Nordic Sea hits ~20 °C in the summers. But still, that temperature is good enough for cooling.

In Helsinki they are now collecting the excess heat from a data center to warm up houses in the city area. So that too should be considered; I think many countries could use some "free" heating.

Surface temperature does, but below the surface it's cooler, even in small lakes and rivers; otherwise our drinking water would be unusable, 25 °C out of the tap. You would get legionella and such. In Sweden, water over 20 °C at the inlet (or out of the tap, for that matter) is not considered usable. Lakes, rivers and oceans can provide 2-15 °C at the inlet year-round here in Scandinavia if the inlet is appropriately placed. Certainly good enough if you allow temperatures over the old 20-22 °C.

There's a lot of work being done on the UPS side of the power-consumption coin too. FB uses Delta DC UPSes that power their equipment directly with DC from the batteries, instead of the wasteful invert-to-480 VAC three-phase, then rectify-again-at-the-server-PSU approach, as well as Eaton equipment with ESS that bypasses the UPS until there's an actual power loss (for about a 10% efficiency pickup when running on mains power).

Yeah, there is a lot of movement on this these days, but the hard part is that at the low voltages used in servers (<=24 V) you need a massive amount of current to feed several racks of servers, so you need massive power bars, and of course you can lose a lot of efficiency on that side as well.
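The current problem is simple Ohm's-law arithmetic; a sketch assuming a hypothetical 10 kW rack (the rack size is my assumption, not a figure from the thread):

```python
# Current needed to feed an assumed 10 kW rack at various
# distribution voltages (I = P / V). The 10 kW rack is hypothetical.
rack_power_w = 10_000
for volts in (480, 48, 24, 12):
    amps = rack_power_w / volts
    print(f"{volts:>3} V -> {amps:7,.0f} A per rack")
```

At 24 V a single 10 kW rack already needs over 400 A, and conductor (I²R) losses scale with the square of that current, hence the massive power bars.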

Microsoft is building a massive data center in my home state, just outside Cheyenne, WY. I wonder why more companies haven't done this yet? It's very dry, and days above 90 °F are few and far between in the summer. Seems like an easy cooling solution versus all the data centers in places like Dallas.

Building in cooler climes is great, but you also need the networking infrastructure to support said big data center. Heck, for free cooling, build the data centers in the far frozen reaches of northern Canada, or in Antarctica. Only, how will you get the data to the data center?

It's actually right along the I-80 corridor that connects Chicago and San Francisco. Several major backbones run along that route, and it's why many mega data centers in Iowa are also built along I-80. Microsoft and the NCAR Yellowstone supercomputer are there, so a large pipe is definitely accessible.

That map of Europe is certainly plain wrong. Spain especially, but also Greece and Italy, easily have some days above 35 °C. It also happens a couple of days per year where I live, a lot further north than any of those.

Do you really get 35 °C, in the shade, outside, for more than 260 hours a year? I'm sure it happens for a few hours a day in the two hottest months, but the map does cap out at 8500 out of 8760 hours.

What about wear and tear from running the equipment at hotter temperatures? I remember seeing a chart where higher temperature = shorter lifespan. I would imagine the OEMs have engineered some margin over this, and warranties aside, it should be basic physics?

Actually, the IT equipment (servers & networking) uses more power than the cooling equipment. Ref: http://www.electronics-cooling.com/2010/12/energy-... "The IT equipment usually consumes about 45-55% of the total electricity, and total cooling energy consumption is roughly 30-40% of the total energy use."

That is the whole point, isn't it? The IT equipment uses power to be productive; everything else supports the IT equipment and is thus overhead that you have to minimize. Of the facility power, the CRACs are the most important power gobblers.
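This overhead ratio is exactly what PUE (Power Usage Effectiveness: total facility power divided by IT power) captures. A minimal sketch with illustrative numbers of my own, anchored to the percentages cited above:

```python
def pue(total_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_kw / it_kw

# IT at 50% of the total (within the cited 45-55% range) -> PUE of 2.0
print(pue(1000, 500))
# A "free cooling" facility where IT is ~87% of the total -> PUE ~1.15
print(round(pue(1000, 870), 2))
```

A PUE of 1.15 means only about 13% of the facility's power goes to cooling, power conversion, and everything else that isn't the IT load.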

On the first page you mention "The "single-tenant" data centers of Facebook, Google, Microsoft and Yahoo that use "free cooling" to its full potential are able to achieve an astonishing PUE of 1.15-1."