Posted by kdawson on Saturday April 12, 2008 @06:42PM
from the formerly-hot-hardware dept.

bigwophh writes "Liquid cooling a PC has traditionally been considered an extreme solution, pursued by enthusiasts trying to squeeze every last bit of performance from their systems. In recent years, however, liquid cooling has moved toward the mainstream, as evidenced by the number of manufacturers producing entry-level, all-in-one kits. These kits are usually easy to install and operate, but at the expense of performance. Asetek's aptly named LCLC (Low Cost Liquid Cooling) may resemble other liquid cooling setups, but it offers a number of features that set it apart. For one, the LCLC is a totally sealed system that comes pre-assembled. Secondly, plastic tubing and a non-toxic, non-flammable liquid are used to overcome evaporation issues, eliminating the need to refill the system. And to further simplify the LCLC, its pump and water block are integrated into a single unit. Considering its relative simplicity, silence, and low cost, the Asetek LCLC performs quite well, besting traditional air coolers by a large margin in some tests."

Wouldn't "is a totally sealed system" take care of "evaporation issues, eliminating the need to refill the system" without requiring "plastic tubing and a non-toxic, non-flammable liquid"????
I'm just saying....

If you had RTFA, you would've found that making a sealed system apparently isn't enough by itself. The silicone tubing used in most liquid-cooling rigs apparently is somewhat permeable, so water can seep through it and evaporate. Replacing silicone with vinyl fixes that, at the expense of slightly increased rigidity.

This is very true. I just recently disassembled my system in favor of a Core2 Duo machine. I built the rig because my 1st-gen P4 3.6GHz was a pain to air-cool efficiently. About a year after assembling the system, I noticed that the temps climbed rapidly moments after power-up, and found that almost all my fluid had gone from the system.

What I thought was fluid was actually UV dye from the coolant that had permeated the silicone tubing. Additionally, when I stripped the system, all the tubing

I'm surprised liquid cooling is still seen as a fringe/hobbyist technique. With water (or oil) having a much higher heat capacity than air, I would have thought liquid cooling would make sense for datacentres: instead of huge electricity bills for A/C, you could just plumb each rack into the building's water system (via a heat exchanger, of course; I don't really want to drink anything that's passed through a server rack). Does anyone know if this has been tried, and if so, why it didn't work?

I believe that Sun's Blackbox uses water cooling for refrigeration between racks, not as a method of cooling the server hardware directly. Like the sibling poster says, too much risk of leakage near the electrical bits. With however many gallons/sec Blackbox requires though, you can turn a lot of hot air back into cold air and just move it around in a circle.

i would have thought liquid cooling would make sense for datacentres - instead of huge electricity bills for A/C you could just plumb each rack into the building's water system

There are a few things that come to mind:

- A datacenter might have different clients renting a cage and owning their own servers; you can't enforce the use of watercooling. AC will have to be present and running in any case.

- Water + electricity is a risk. With tight SLAs, you don't want to fry your server, along with your extra investment in its redundant failover hardware.

- Available server hardware isn't typically watercooled. Who's going to convince the client that hacking a watercooled system onto their most critical hardware is a good decision? For defects, a support contract with the hardware vendor is typical. If you mod it and soak it, you're out of warranty and can't fall back on your external SLA.

- Electricity "bills" aren't an issue: you get so many amps you can run on each cage. If you rent, you keep under that limit or you'll have to rent another cage (notice an advantage for the datacenter here?). It's always part of the calculated cost; it's really a non-issue for datacenters, or for you when you want to rent part of one.

I don't know where you are hosting where "electricity bills" don't matter. I have systems hosted in 3 different DCs, with 3 different companies. All of them raised their rates in the last year by 20-30% in one way or another. One DC includes the electricity in your flat monthly bill; the only incremental charge in that DC is bandwidth (i.e., you get 100GB of transfer, and if you go over, it's some dollars per GB). They raised their flat rate 20%, citing higher electricity costs.

Y'all are basically idiots. I just came from NASA Ames Research Center (talk about heavy supercomputing!), and they are heavily water-cooled. Right now they have coolers on each of the processor blocks and radiators on the backs of the cabinets, but they are quickly moving to directly chilling the water. They use quality hoses and fittings; no leakage. The efficiency is so much higher than air, and it makes the operating environment much nicer. (They have people in there regularly swapping out drives, tapes, whatev

I actually have a rack of watercooled equipment sitting in a datacenter. Air cooling was not an option because the air-cooling system was maxed out for that floor, whilst there was plenty of floorspace left. (Blame it on the silly cooling requirements of bladeservers.)

- Free (both source and disposal)
- Non-conductive
- Non-corrosive
- Lightweight
- Will not undergo phase change under typical or emergency server conditions (think water > steam)
- Cooling air does not need to be kept separate from breathing air, unlike water, which must be kept completely separate from potable water

I don't think water/oil cooling is ready for mainstream data farm applications quite yet. I also think that future processors will use technology that isn't nearly as hot and wasteful as what we use now, making water cooling a moot point.

Air is one of the most corrosive substances there is. Specifically, the oxygen in the air is. It just takes time. Normally, a server won't be in operation long enough for this kind of corrosion to happen, especially if it uses gold-plated contacts, but it will happen.

Air is less corrosive. But depending on the liquid in use in a liquid cooling rig, it usually isn't corrosive or dangerous to a computer anyway. Liquid cooling rigs usually use an oil such as mineral oil, or an alcohol like propanol, neither of which is particularly harmful to electronics.

Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightning.

Finally... if your server is running hot enough that mineral oil is boiling off, you've got more serious things to worry about than that. (Its boiling point varies by grade, between 260-330°C -- http://www.jtbaker.com/msds/englishhtml/M7700.htm [jtbaker.com])

Also... while it's a technicality, air *is* conductive. It just has a very high impedance. It *will* conduct electricity, and I'm pretty near certain you've seen it happen: it's called lightning.

If you want to get all technical about it, you're basically wrong. The resistivity of air is exceedingly high. However, like all insulators, it has a breakdown strength, and at electric field strengths beyond that, the conduction mode changes. It's not simply a very high value resistor -- nonconducting air and conducting air are two very different states, which is the reason lightning happens. The air doesn't conduct, allowing the charge to build higher and higher, until the field is strong enough that breakdown begins.

For materials with resistivity as high as air in its normal state, it's not reasonable to call them conducting except under the most extreme conditions. Typical resistance values for air paths found in computers would be on the order of petaohms. While there is some sense in which a petaohm resistor conducts, the cases where that is relevant are so vanishingly rare that it is far more productive to the discussion to simply say it doesn't conduct.

This is one of those cases. Claiming that air is conductive is detrimental to the discussion at best.
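To put rough numbers on this (textbook order-of-magnitude figures for dry air at 1 atm, not anything from TFA; the gap geometry is a made-up example):

```python
# Two regimes for air: an absurdly high-value resistor below breakdown,
# and an arc channel above it. Figures are textbook orders of magnitude.

RHO_AIR = 1.3e16       # ohm*m, resistivity of dry air (order of magnitude)
E_BREAKDOWN = 3.0e6    # V/m, dielectric breakdown strength of air at 1 atm

def gap_resistance(length_m, area_m2):
    """Ohmic resistance of an air gap below breakdown: R = rho * L / A."""
    return RHO_AIR * length_m / area_m2

def breakdown_voltage(length_m):
    """Approximate arc-over voltage, assuming a uniform field."""
    return E_BREAKDOWN * length_m

# A 5 mm gap between two 1 cm^2 pads inside a PC case:
gap, area = 5e-3, 1e-4
r = gap_resistance(gap, area)    # ~6.5e17 ohms -- hundreds of petaohms
v = breakdown_voltage(gap)       # 15 kV before it arcs over

print(f"resistance below breakdown: {r:.1e} ohms")
print(f"arc-over voltage: {v / 1000:.0f} kV")
print(f"leakage at 12 V: {12 / r:.1e} A")  # effectively zero current
```

At PC voltages the "current" through that gap is tens of attoamps, which is exactly the point about it being more productive to simply say air doesn't conduct.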

I'm sure there's some engineering to be done to solve that problem for servers. You could make copper piping through the entire solution (this IS a server, no expense is spared, no need for flimsy tubing) that would be good for 70+ years. You don't keep servers operating 70 years typically. Well, at least *I* don't. Some of you guys might just to brag about how much uptime your Linux box has to your great grandchildren.

If you live in a desert climate, air cooling sucks, and if the dust is not dealt with at regular intervals, things fail quickly. First, dust starts to accumulate on the fan blades (unevenly), putting the fan out of balance and thus placing greater strain on its bearings. Meanwhile, Intel's ingenious design of their retail cooling fan and heatsink ends up clogged with dust. The ambient temperature inside the chassis begins to increase as the chassis fan and PSU fans have now ceased, leaving only the higher power cp

A DC might have 20,000 servers. That heat has to go SOMEWHERE. If it's pumped into the ambient air just like with an aircooled machine, you're still going to need large AC units to move that hot air out of the DC.

With the caveat that thermodynamics scares and confuses me, if you have a bunch of heat coming out of the servers' water-coolers, couldn't you pipe that into a heat pump and recover some cooling energy or even electricity? I'm familiar with a local facility which operates its air conditioning systems on steam, though I forget the name of the technology at the moment.

With the caveat that thermodynamics scares and confuses me, if you have a bunch of heat coming out of the servers' water-coolers, couldn't you pipe that into a heat pump and recover some cooling energy or even electricity?

Yes. Now, THAT would be smart. Eliminate the cost of water heaters, augment winter HVAC bills, etc. Steam power plants use "waste" energy, the heat left over in the water after it runs the main turbines, to preheat the water going into the boiler. There's usually heat left over after THAT, and it is at a good temp for use in the power plant building itself. Any heat sent back out to the environment is wasted, and wasted energy = wasted $$.

Yes I know what those are. But carrying the heat away from the servers and venting it to the room isn't going to help the overall need to cool everything. It may make each server slightly cooler but it's not going to alleviate the need for power or cooling in any data center overall.

Considering most CPUs can run at 90°C, but most air-cooled systems recommend something below 23°C ambient, the efficiency of a cooling system maintaining 60°C would be much greater, which with water's much higher conductivity should be fine; hence the cost savings of liquid cooling.

instead of huge electricity bills for A/C you could just plumb each rack into the building's water system

Or you could use magical pixie dust...

The ONLY THING water cooling does is (potentially) provide a larger surface area to disperse the heat. It does not magically "cool" anything. Unless ambient temperatures are always much lower than you want your datacenter to be, you'll still be running the water through an A/C. And if you're lucky enough to be someplace that ambient temperatures are always that lo

The ONLY THING water cooling does is (potentially) provide a larger surface area to disperse the heat.

So totally wrong/ignorant... Is this a troll? Water cooling does a lot more than that.

1. It can be a LOT quieter than normal air cooling.
2. It allows for heat removal with a much smaller heat exchange unit on the heat source.
3. It allows for heat transfer to a location less affected by the excess heat being dumped (such as outside a case), instead of just dumping the heat in the immediate vicinity of either the item being cooled or other components affected by heat.

There are other reasons, but these alone are more than enough. Did you not know these, or were you just trolling?

1. Is a result of the larger heat exchange area, and makes no difference in a data center.
2. No benefit for any practical application. Definitely makes no difference in a data center.
3. Does not affect the cooling costs of a data center in the slightest.

Nothing about water cooling will reduce the cooling and energy costs of a data center IN THE SLIGHTEST. You're doing a lot of magical thinking, with NO experience in the subject.

My experience is with datacenters that are apparently not as generic as the ones you seem to be claiming experience with. Just because you do things one way, and maybe always will, does not mean that another customer won't have requirements that your cookie-cutter approach can't satisfy. Put a datacenter 300 ft underground and see how far simple air cooling gets you. In that case, there MUST be a way to dump the heat that doesn't involve simply blowing air around. If it works for you, that's fine. But attem

You keep going on about experience, when it's obvious you're the one with the limited experience, since you have not seen any actual applications that require more thought than "stick some more fans on it and it'll be ok" or "well, just put another AC exchanger on the roof and it'll probably work fine". You still have to get the heat out. And that's the whole point of water cooling: getting the heat out without relying on whooshing air around (whether it's air blowing on heatsinks or air in a clim

Liquid cooling can affect the energy costs in a big way depending on how well integrated the system is. As an example, CoolIT systems had developed a server rack with an integrated liquid cooling system that they had shown off at CES this year. The rack essentially used hydraulic fittings to allow you to hot-swap systems from the chassis, while still keeping the cooling centralized.

They had essentially used the radiator from a Honda Accord, which they found to be able to dissipate between 25 and 35 kW o

With a system like this centralizing the area where heat is dumped, fluids can be piped out to a radiator sitting outside, so essentially, a large portion of the heat produced from a rack of computers, can be relocated outside of the data center.
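For a sense of scale on that quoted 25-35 kW figure, here's a back-of-the-envelope sketch (the 10 K coolant temperature rise is my illustrative assumption, not a CoolIT spec):

```python
# Water flow needed to carry a rack-scale heat load out to an external
# radiator. The 10 K coolant temperature rise is an assumed figure.

CP_WATER = 4186.0  # J/(kg*K), specific heat of water

def mass_flow(load_w, delta_t_k):
    """kg/s of water needed to absorb load_w watts with a delta_t_k rise."""
    return load_w / (CP_WATER * delta_t_k)

flow = mass_flow(30_000, 10.0)  # ~0.72 kg/s for a 30 kW rack
print(f"{flow:.2f} kg/s, roughly {flow * 60:.0f} L/min")  # ~43 L/min
```

~43 L/min is garden-hose territory, which is why a single car radiator is a plausible heat exchanger for a whole rack.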

You could similarly open up a data center, with just large fans blowing ambient air in and out.

With either method, it just wouldn't work. A $50,000 server rack is not your home PC. It's not

The biggest issue with running a datacenter on 40°C ambient air with big fans blowing it in and out the doors is that air cooling is so inefficient that the cooled components would overheat, since they can dump so little heat into AIR. 40°C WATER cooling, on the other hand, would bring those CPU and HDD temps down a good bit.

You're failing to understand just how much better water transfers heat vs air.
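To quantify that, here's a quick comparison using standard property values for water and air at roughly room temperature (the 300 W load and 10 K rise are made-up illustrative numbers):

```python
# Volumetric flow of coolant needed to carry a given heat load with a
# given temperature rise: V = Q / (cp * rho * dT). Property values are
# standard textbook figures at around 25 C.

CP_WATER, RHO_WATER = 4186.0, 997.0  # J/(kg*K), kg/m^3
CP_AIR, RHO_AIR = 1005.0, 1.18       # J/(kg*K), kg/m^3

def flow_needed(load_w, delta_t_k, cp, rho):
    """Volumetric flow (m^3/s) to carry load_w watts with a delta_t_k rise."""
    return load_w / (cp * rho * delta_t_k)

load, dt = 300.0, 10.0  # a hot gaming box, coolant allowed to warm by 10 K
water = flow_needed(load, dt, CP_WATER, RHO_WATER)  # ~0.43 L/min
air = flow_needed(load, dt, CP_AIR, RHO_AIR)        # ~54 CFM

print(f"water: {water * 60000:.2f} L/min")
print(f"air:   {air * 2118.88:.0f} CFM")
print(f"volume ratio, air/water: {air / water:.0f}x")  # ~3500x
```

Per unit volume, water carries heat away roughly 3500 times better than air, which is why a thin tube can replace a case full of airflow.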

This is actually pretty amusing, as when I set up my home office I designed much the same thing! Sadly I didn't put it into place, but indeed it would have worked quite well, I'm sure. Radiator in the crawlspace, temperature-sensing electric cooling fan mounted on the radiator (a Ford Taurus fan, most likely). Copper or PVC piping up through the floor into the office in a loop, with a shutoff valve in the middle to regulate bypass flow. An agro pump to move the liquid, or perhaps a small pool pump. Fittings on the pipe mounted

As someone who makes their living figuring out how to move heat from A to B (in avionics, not datacenters), this comment makes my head hurt for a number of reasons... First off, as others have pointed out, liquid cooling in data centers is a reality, and folks like IBM have worked on liquid cooling for decades. Due to many of the reasons already mentioned, everyone avoids liquid cooling as long as they can, and a number of technologies have helped on this. For example, the transition from Bipolar to CMOS a

Ok, you've got a brain, so you should also know this: Given a water-cooled rig and an air-cooled rig which operate at the same efficiency (in terms of Watts dissipated per Watt of cooling power), water cooling and air cooling perform just about identically as long as everything remains inside the case.

Move the water cooling system's radiator outside of the case, and things start to slant toward water cooling.

Observations:

1. They're equal in cooling capacity, but the air-cooled system is simpler and has fewer

What would change is that you would actually have a better understanding of what you were talking about, vs simply making broad assumptions with ZERO experience. You start by assuming that air and water systems work at the same efficiency and that mounting the radiator inside the case somehow puts them there. This is not true; had you actually worked with any of this, you would know how well it works even with a radiator in the case. But hey, all of the various big guys moving to water to cool their large s

My statements are true. If you do not understand them, then please ask for clarification. If you'd like to refute them, feel free to use your anecdotal evidence to do so. But all you're doing is waving your hands around and talking about "various big boys," as if the mere notion of it growing in popularity obviously means that it is better, while insinuating that the total concept of dissipating waste heat into air is impossible to grasp without actually experiencing a water cooling rig first.

You could run the hot water for the building through a heat exchanger before you heat it up with a boiler.
Cold Water -> Heat Exchanger -> Warm Water -> Boiler -> Hot Water
Over all the energy used to go from Cold Water -> Warm Water is saved.
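Sketching that energy accounting with made-up but plausible temperatures (10°C mains, 40°C after the exchanger, 60°C out of the boiler):

```python
# Energy to heat one litre (~1 kg) of water, with and without preheating
# from the servers' waste heat. Temperatures are illustrative assumptions.

CP_WATER = 4186.0  # J/(kg*K)

def heating_kj_per_litre(t_from_c, t_to_c):
    """kJ to raise one litre of water from t_from_c to t_to_c."""
    return CP_WATER * (t_to_c - t_from_c) / 1000.0

cold, warm, hot = 10.0, 40.0, 60.0
without = heating_kj_per_litre(cold, hot)  # boiler does all the work
with_hx = heating_kj_per_litre(warm, hot)  # boiler only finishes the job

print(f"boiler alone:   {without:.0f} kJ/L")  # ~209 kJ/L
print(f"with exchanger: {with_hx:.0f} kJ/L")  # ~84 kJ/L
print(f"saved: {100 * (1 - with_hx / without):.0f}%")  # 60%
```

With these assumed temperatures, the boiler's job shrinks by 60% per litre; whether that's worth the plumbing depends on how much hot water the building actually uses.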

Not only that but if you attach a Maxwell's Demon to the output you can get cooled water separated out from the hot and then feed that back in to the cooler while sending the separated hot water to the boiler!!!

You could run the hot water for the building through a heat exchanger before you heat it up with a boiler. Cold Water -> Heat Exchanger -> Warm Water -> Boiler -> Hot Water Over all the energy used to go from Cold Water -> Warm Water is saved.

Unless your datacenter is colocated with a (large) laundromat, there just isn't that much demand for hot water at a datacenter. No laundry, no showers, little to no cooking.

Given that they use AC because they can't be bothered to organise a proper air cooling system (pumping the hot air out of the back of the server instead of cooling all the air in the room, etc.), it's simply because it's cheaper to use AC than to actually organise anything.

You do have a good point, though: use of a non-conductive oil, cooled against water pipes, would mean the servers are just as safe as they are at the moment.

If a server came ready-built with fail-safe plumbing and cooling mechanisms, the answer would be yes. Water, oil, Fluorinert - these would all be excellent. Total immersive cooling would be more logical than piped cooling, as there are fewer parts that could fail and less possible damage from a failure. You could have a completely sealed compute unit that contained everything and was ready to go, eliminating any special skills on the part of the data centre or any special requirements in the way of plumb

This is a great bonus for high density HPC applications. Typically in a datacenter you are blowing air up from the raised floor in front of the servers. However, a good deal of it is taken up by the servers in the lower part of the rack, leaving the top servers running warmer than the lower servers. Supposedly the water chilled doors hel

Liquid cooling hit the mainstream, mainframes in fact, 'way back in the early 1980s with the IBM 3090 mainframe. What we now call a water block, IBM called the Thermal Conduction Module. According to an article in Scientific American at the time, it combined a water block and chip packaging. The metal of the TCM directly contacted the chip substrates.

I too have often wondered why more datacenters don't use water (aside from the fact that you'd probably have to build the place from the ground up for it). I think a standardized interface could help immensely:
- each rack-mount server has a cold water input on one side and a hot water output on the other.
- the rack has a cold water rail up one side and a hot return down the other.
- under the floor, each rack is plugged into hot and cold "bus" pipes, which feed into one of those industrial waterfall coolers

Yes, because they are consumer-grade solutions, not enterprise-grade. You can engineer a solution that uses a 100% solid copper piping/waterblock framework with absolutely no joints or gaps anywhere in the actual box. Design the thing so the weak points, or any point where water could possibly leak, are on the outside of the system (obviously not above or below; you might need a custom server cabinet for this), and you can have cooling solutions that will not harm server hardware due to leaking. I'm not saying my i

Even the article, which tries hard to tout its benefits, shows in its own stats that it's not worth it. Either it's a crappy implementation or it's simply not relevant.

How so? They show that it's quieter and more effective than stock cooling, and significantly quieter than an aftermarket air cooling solution. What exactly are you looking for then? You gotta be more specific than just a completely unsupported criticism that doesn't even reflect the test results, let alone explain your personal criteria.

You don't have to be an entirely patronizing asshole. But I'm guessing you don't work in sales.

Ok, so it's marginally quieter. As for its absolute cooling power, it's on par with whatever air-cooled unit you can get today, which has a lot less complexity. All in all, that's pretty weak justification. If that's your definition of "it works well," then you clearly care about noise above all other criteria. There are probably better ways to make your PC quieter than this.

But you are still arguing from a position lacking in factual information. Water cooling can be almost completely silent, and can remain so even when cooling hardware that would otherwise require very loud fans for conventional air cooling.

This does not even address the additional cooling requirements seen in overclocking, small form factor, or otherwise special-use equipment. An air-cooled HTPC, for example, typically has to trade off performance for noise, as hig

So it's just quieter. Again, if that's your main concern, then fine. There are probably less problematic ways to address that. I for one would not want to ever have to worry about hundreds of little water cooling systems in a data center, each with the potential to break down and cause a catastrophic failure among many machines. Back in the old TCM days you might have a dozen CEM complexes, each with a TCM, ganged into a single chiller pump system. That's a kind of failure rate that's manageable. But with 5000

In a datacenter using blade servers, I'd expect some sort of hybrid heat exchange system would be more useful than pure water cooling. I strongly disagree that the only benefit is lower noise, but also remember that we're not just talking about datacenter applications. All sorts of applications (such as the HTPC setup I described) could get not only lower noise but also higher performance due to the better-managed thermal load. And that is all water cooling does - allow a better and more manag

plastic tubing and a non-toxic, non-flammable liquid are used to overcome evaporation issues,

If it is truly sealed, there should never be any "evaporation issues," as there is nowhere for it to evaporate to. Being non-toxic and non-flammable has nothing to do with it. I can think of another very common non-flammable, non-toxic (in most of its forms and uses) compound that's readily available but is NOT used, specifically because it tends to boil at relatively low temps and low pressures: dihydrogen monoxide. As for plastic tubing, what else are you going to make it from? Metal? You could, but most sys

The article says that most water cooling systems use silicone tubing, which the author seems to think is not a plastic. I'm not an expert on plastics, but PVC seems like a poor choice to me. It's too likely to degrade over a decade or so and become brittle or fragile.

The clear plastic "bling" tubing is often medical grade, and guess what? Over time, liquid EVAPS from such systems. How? It actually manages to be absorbed by the tubing and slowly dissipates into the air, which is why this system uses a different kind of tubing and why they highlighted its lack of evaporation issues. You haven't run a liquid cooled rig, have you? Oh, and plain old water is a BAD idea in a liquid cooled PC. For one, it tends to oxidize things like copper heatsinks over time, and for another, you ge

It doesn't seem much different from the Gigabyte kit I put in my computer 2 years ago http://www.cluboc.net/reviews/super_cooling/gigabyte/galaxy/index.htm [cluboc.net], the only difference being the pre-built bit, which could cause great difficulty if you want to do something sensible like mount the radiator on the outside.
(Note: soon after I got mine they released a second version with a different pump and reservoir, and I can tell why; after 13 months, just out of warranty, my reservoir cracked.)

This is kind of inevitable, and IMHO overdue. Monolithic heat sinks and fans the size of jet engine intakes have been a pain in the arse for top of the range gaming machines for years. Also, I don't know about anyone else, but the air cooling of my computer is a depressingly efficient mechanism for sucking dust and fluff into the computer and keeping it there.

Intel and AMD systems are also using heat pipes just like the Shuttle XPCs and have been for a year or three now. All of the best "air" heatsinks I am aware of use some liquid in them in the same fashion. Shuttle just managed to build it such that the radiator was a little further divorced from the heatsource is all.

The important difference here is that the heat exchanger in Shuttle PCs uses heat pipes, which, through the use of a pressurized fluid, utilize phase change to transport heat.
Phase change systems include heat pipes and those using compressors. Liquid cooling in PCs involves pumps, tubing/piping, and a liquid; the fluid never changes to a gas. In this way, your point does not apply to the liquid cooling topic.
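The difference is easy to quantify: a heat pipe moves latent heat while a pumped loop moves sensible heat. Per kilogram of water, using textbook figures (the 10 K rise for the pumped loop is my assumption):

```python
# Heat transported per kg of water: latent (heat pipe, phase change)
# vs sensible (pumped loop, temperature rise only). Textbook figures.

LATENT_VAPORIZATION = 2.26e6  # J/kg, water at ~100 C
CP_WATER = 4186.0             # J/(kg*K)

delta_t = 10.0                 # assumed coolant rise in a pumped loop
sensible = CP_WATER * delta_t  # J per kg moved by the pump
latent = LATENT_VAPORIZATION   # J per kg evaporated in a heat pipe

print(f"pumped loop: {sensible / 1000:.1f} kJ/kg")  # ~41.9 kJ/kg
print(f"heat pipe:   {latent / 1e6:.2f} MJ/kg")     # 2.26 MJ/kg
print(f"ratio: {latent / sensible:.0f}x")           # ~54x per unit mass
```

Of course a pumped loop compensates by moving far more mass per second; the point is just that the two mechanisms are physically different, as the parent says.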

I have the black Al faced one for longer PSs. It was extremely easy to set the water cooling up, and has kept my machine cool even with two extra blocks for the SLI cards and a chipset cooler. Yes it's not sealed, but then again, is that

I can't help but think that this is a stop-gap measure. I used to read up on all the various methods of silencing a computer (with the intention of implementing them myself), but for consumer-grade applications I'd prefer to wait for a variant of Moore's Law to do its work - the propensity for performance per watt to keep increasing until it nears whatever limits are predicted by information theory. At that stage there will be an option to cool with no moving parts for typical desktop/laptop applications, and it w

Anybody actually find where you can buy this system? The article only says that they found one and the price for a minimal setup, but not where. I'm upgrading soon and this could be a good addition to some new hardware.
Googling for "asetek liquid cooling system" only finds pages of news articles :(

Unfortunately, Asetek does not sell to the retail market, and has no plans to ever sell to the retail market. This means that even if you do manage to get ahold of one, you will get zero support from the manufacturer if anything goes wrong. Not something I'd risk.

when some standards have been defined and actually used.
I'm sure one day we'll have an 'ATX+' power supply. As well as the plethora of wires hanging out the back of it, we'll have some loops of tubing with heat-exchangers on the end. Maybe standard ones for chipset, CPU and a couple of GPU ones. Buy a new graphics card and just snap on the right heatsink.
It's never going to take off until the systems are all sealed (My mum is not going to buy a Dell with a bottle of 'UV Reactive' magic solvent).
Sealed sy

Swiftech makes a system you might be interested in that's also self contained. The pump sits right on top of the CPU and the heat exchanger fits where your 120mm exhaust fan is normally mounted. I'm not using it and would only consider it if I were cooling my vid card too but a friend is using it and REALLY likes it - trouble free install on his box.