New server cooling technology deployed in pilot program at Calit2

September 20, 2012

Closeup of the back of the server rack with Cool-Flo installed. The coolant flows through the tubes into the server and carries the heat out like a circulatory system.

(Phys.org)—The California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego has become the inaugural test site for a new approach to cooling computer servers – a technology that could improve energy efficiency and enable higher-performance computing.

The Cool-Flo technology was developed by Steve Harrington, CEO of Flometrics, Inc., and an expert in fluid dynamics and thermodynamics. Harrington earned his Ph.D. in applied mechanics at UC San Diego, and is a part-time instructor in UCSD's Mechanical and Aerospace Engineering department, where he teaches a senior design course in aerospace engineering in which undergraduate students instrument, build and fly hybrid rockets. The technology was installed in the Calit2 server room as part of ongoing research led by Computer Science and Engineering professor Tajana Simunic Rosing into energy efficient data centers.

Flometrics installed the new Chilldyne Cool-Flo negative-pressure liquid cooling system as a pilot program to cool three 1U servers in Calit2's server room in Atkinson Hall on the UC San Diego campus. The servers monitor processor, RAM and chipset temperatures, as well as the total power consumed by the servers, while running typical research loads.

The technology is a server-agnostic, rack-based, direct-to-the-chip and leak-free liquid cooling system that can be used to cool any server. It is based on rocket-cooling technology and utilizes a pump developed with a grant from NASA's Small Business Innovation Research program.

The Cool-Flo pump, with a simple stop/start interface, is conveniently located under the server rack for easy access.

Cool-Flo also utilizes a proprietary no-drip, hot swap connector enabling technicians to remove servers while the system continues to cool the remaining servers. The technology also uses negative-pressure coolant flow that eliminates leaks.

This video demonstrates the fail-safe system in case the tubing breaks or is cut. Watch as the liquid is safely and cleanly withdrawn so that no electronics are damaged by stray drips.

"Not only is there an advantage of power reduction by 25 to 35 percent, but you are lowering existing CPU temperatures by 30 degrees Celsius, resulting in practically unlimited density," explained Harrington. "Cool-Flo is a good fit for Calit2's server needs given the institute's commitment to reducing the energy intensity of campus IT and improving energy efficiency."

The system's power savings come from eliminating HVAC cooling, reducing the leakage current of CPUs running at lower temperatures, and cutting the power needed to run the servers' fans.


The Cool-Flo system allows a single server to be disconnected from the cooling loop for quick swapping while the rest of the rack stays cooled. The no-drip connectors prevent any dripping, making removal as easy as unplugging a computer.

According to Harrington, the system reduces the power usage effectiveness (PUE) of data centers from 1.6 to 1.1 or less. PUE, a measure defined by The Green Grid consortium as total facility power divided by IT equipment power, has an ideal value of 1.0.
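At a constant IT load, a drop from PUE 1.6 to 1.1 works out to roughly a 30 percent cut in total facility power, consistent with the 25 to 35 percent range Harrington cites. A minimal sketch of that arithmetic, using a hypothetical 100 kW IT load:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

it_load_kw = 100.0  # hypothetical IT equipment draw

# Facility draw implied by each PUE figure at that load.
facility_before = it_load_kw * 1.6  # 160 kW total at PUE 1.6
facility_after = it_load_kw * 1.1   # 110 kW total at PUE 1.1

print(pue(facility_before, it_load_kw))  # 1.6
print(pue(facility_after, it_load_kw))   # 1.1

# Cooling/overhead drops from 60 kW to 10 kW per 100 kW of IT load,
# i.e. total facility power falls by 50/160 = 31.25 percent.
savings = 1 - facility_after / facility_before
print(f"{savings:.1%}")  # 31.2%
```

The 31 percent figure assumes the IT load itself is unchanged; the article notes fan power and CPU leakage also fall, so the real-world savings could differ.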

"We are pleased to host this novel approach to computer cooling, which enables the use of liquids (water with a rust inhibitor, or perfluorocarbon) for improved heat exchange, while minimizing the risk of leaks in the data center," said Calit2's technology infrastructure manager, Tad Reynales.



