Immersion Cooling Floated as Green Energy Solution

By Tiffany Trader

February 14, 2014

Even as progress under Moore’s Law slows, demand for data processing is at an all-time high. An estimated 90 percent of all the data in the world was created in the last two years, and datacenters account for roughly 2 percent of total US electricity usage. Keeping all those warehouse-scale server farms and supercomputers at their optimal operating temperature is costly, both economically and environmentally. The challenge has some big iron operators giving their computers a bath, as highlighted in a recent piece in The New York Times.

Japan has been especially motivated by energy concerns. The country has operated with a reduced electricity supply ever since the March 2011 earthquake and tsunami and the subsequent meltdown at the Fukushima Daiichi power plant. As a protective measure, it also limited the output of its other nuclear facilities, further curtailing supply.

With energy budgets constrained, organizations with major electricity needs had to get creative. Such was the case at the Tokyo Institute of Technology: the university administration had capped power supplies, but the institute required more supercomputing capacity. Thus the idea for the oil-cooled Tsubame KFC was born. The supercomputer employs a mineral oil-based cooling solution, called CarnotJet, developed by Austin, Texas-based Green Revolution Cooling. In November 2013, Tsubame KFC topped the Green500 list as the world’s most energy-efficient supercomputer. The system is 50 percent more powerful than an older machine with the same energy footprint.

Other companies involved in the space include Iceotope, a start-up based in Sheffield, England, which promotes the use of a liquid fluoroplastic rather than oil. Another vendor, the Hong Kong-based Allied Control, used 3M’s “passive two-phase liquid immersion cooling system” for a 500kW datacenter that mines Bitcoin.

Liquid cooling is not new. Compared to air, liquids are more efficient coolants because they conduct heat better and have a higher heat capacity. In fact, liquids are estimated to be about 4,000 times more effective at removing heat than air.
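That ballpark advantage can be sanity-checked from volumetric heat capacity (density times specific heat): how much energy a cubic meter of coolant absorbs per degree of temperature rise. The sketch below uses typical textbook property values assumed for illustration, not figures from the article; the widely quoted 4,000x figure also folds in conduction and convection effects beyond raw heat capacity.

```python
# Rough comparison of coolants by volumetric heat capacity
# (density * specific heat). Property values are typical
# room-temperature textbook figures, assumed for illustration.

coolants = {
    # name: (density in kg/m^3, specific heat in J/(kg*K))
    "air":         (1.2,    1005.0),
    "mineral oil": (850.0,  1670.0),
    "water":       (1000.0, 4186.0),
}

def volumetric_heat_capacity(density, specific_heat):
    """Energy absorbed per cubic meter per kelvin, in J/(m^3*K)."""
    return density * specific_heat

air_vhc = volumetric_heat_capacity(*coolants["air"])
for name, props in coolants.items():
    vhc = volumetric_heat_capacity(*props)
    print(f"{name:12s} {vhc / 1e6:8.4f} MJ/(m^3*K)  {vhc / air_vhc:7.0f}x air")
```

By this measure alone, water holds roughly 3,500 times more heat per unit volume than air, and mineral oil over 1,000 times more, which is why a modest flow of oil can replace enormous volumes of chilled air.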

Iconic American supercomputer maker Cray used submersion liquid cooling back in the 1980s, but the approach never fully caught on, partly because of cost and partly because the coolants of that era were known ozone depleters. A combination of air conditioning and pipe-based water cooling worked well enough and reduced the cost and complexity of the datacenter build.

Now conditions have aligned to make immersive cooling attractive again. Supercomputer centers can easily spend tens of millions of dollars a year on energy bills. For the biggest corporate datacenters, the cost is even higher, running to hundreds of millions a year. In both cases, it’s not uncommon for cooling to account for half of that bill.
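Datacenter operators usually express that overhead as Power Usage Effectiveness (PUE): total facility power divided by the power reaching the IT equipment. The load figures below are hypothetical, chosen only to show how a cooling-and-overhead draw that matches the IT load implies a PUE around 2.0, while a low-overhead design such as immersion cooling can land near 1.1.

```python
# Power Usage Effectiveness (PUE) = total facility power / IT power.
# A perfect facility (zero cooling/overhead draw) would score 1.0.
# All load figures below are hypothetical, for illustration only.

def pue(it_power_kw, overhead_power_kw):
    """Total facility power divided by the power doing useful IT work."""
    return (it_power_kw + overhead_power_kw) / it_power_kw

# Conventional air-cooled room where overhead roughly equals IT load:
print(pue(1000, 1000))   # -> 2.0

# Hypothetical immersion-cooled deployment with modest pump overhead:
print(pue(1000, 100))    # -> 1.1
```

The economic stakes follow directly: halving overhead on a facility spending $100 million a year at PUE 2.0 would save tens of millions annually, which is the arithmetic driving renewed interest in immersion.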

So far, immersive cooling has made it into more supercomputers than datacenters, according to Christiaan Best, chief executive of Green Revolution, who suggests that corporations are more risk averse than the academic crowd, which tends to be more open to experimentation.

“You can imagine, if we walk in and say, ‘Why don’t you take your data center and put it in oil,’ you have to have something pretty solid to point to,” Mr. Best said.

And yet Green Revolution’s method has been part of several datacenter deployments, including a United States Department of Defense facility. A one-year study of the system performed by Intel found that the servers suffered no ill effects, and power consumption dropped considerably.
