Computer Clusters That Heat Houses

In contrast, because water captures heat far more effectively than air, water cooling can return that heat at much higher, more usable temperatures, says Michel. Water was once commonly used to cool mainframe computers, but that approach merely piped cold water through server cabinets to chill the air near the racks.
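A back-of-the-envelope calculation shows why water is so much better at carrying heat away. The sketch below uses standard textbook values (not figures from the article) to compare how much heat a cubic meter of each coolant absorbs per degree of temperature rise:

```python
# Rough comparison of volumetric heat capacity, using assumed
# textbook values: how much heat 1 m^3 of coolant absorbs per
# kelvin of temperature rise.

SPECIFIC_HEAT_WATER = 4186  # J/(kg*K), liquid water near room temperature
DENSITY_WATER = 1000        # kg/m^3

SPECIFIC_HEAT_AIR = 1005    # J/(kg*K), dry air at constant pressure
DENSITY_AIR = 1.2           # kg/m^3 at room conditions

vol_heat_water = SPECIFIC_HEAT_WATER * DENSITY_WATER  # ~4.2e6 J/(m^3*K)
vol_heat_air = SPECIFIC_HEAT_AIR * DENSITY_AIR        # ~1.2e3 J/(m^3*K)

# Water absorbs on the order of a few thousand times more heat
# per unit volume than air.
print(round(vol_heat_water / vol_heat_air))
```

The roughly three-orders-of-magnitude gap is why a water loop can run hot enough for the outflow to be useful, while air cooling must move enormous volumes just to keep temperatures down.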

By some estimates, information technology infrastructure is responsible for as much as 2 percent of global carbon emissions, putting it on a par with aviation. And some experts say that this figure is set to double in the next five years.

“It’s more efficient to heat water and move it somewhere else than it is with air,” says Jonathan Koomey, a project scientist at Lawrence Berkeley National Laboratory and a consulting professor at Stanford University. In 2005, data centers were responsible for 1 percent of global electricity use, a doubling of 2000 levels, Koomey says. But he’s not convinced that the figure will continue to grow. “There are many ways to improve the efficiency of data centers,” he says. For example, better management of computer centers can improve efficiencies dramatically. “We have servers that on average are running at 5 to 15 percent of their maximum load,” Koomey says. “Even if the server is doing nothing, it’s still using 60 to 70 percent of its power.”
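Koomey's figures imply that lightly loaded servers pay a steep energy penalty per unit of useful work. The sketch below applies a simple linear power model to the numbers quoted above; the peak-power value and the model itself are illustrative assumptions, not data from the article:

```python
# Illustrative sketch of why low utilization wastes energy, using the
# article's figures: servers average 5-15% of maximum load, yet an idle
# server still draws 60-70% of its power.

PEAK_POWER_W = 400  # hypothetical peak draw of one server, in watts


def power_draw(utilization, idle_fraction=0.65, peak=PEAK_POWER_W):
    """Simple linear power model: a fixed idle floor plus a
    load-proportional component."""
    return peak * (idle_fraction + (1 - idle_fraction) * utilization)


# A server at 10% load still draws about 68% of its peak power...
p = power_draw(0.10)  # 400 * (0.65 + 0.35 * 0.10) = 274 W

# ...so each unit of work costs ~6.8x more energy than at full load.
energy_penalty = (p / 0.10) / power_draw(1.0)
print(round(energy_penalty, 2))
```

Consolidating work onto fewer, busier servers attacks exactly this penalty, which is one reason Koomey sees room for efficiency gains without new cooling technology.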

Brand also notes that “air is a much cheaper way to do the cooling” and that modern data centers consume far less energy for cooling than their older counterparts did.

The trend toward stacking processors on top of each other to increase their power density is another reason why IBM is pursuing this sort of microfluidic water cooling, says Michel. Such three-dimensional chips will pose serious problems for traditional air-based cooling systems, he says.