
1sockchuck writes "Some data centers are kept as chilly as meat lockers. But IT operations in colder regions face challenges in managing conditions — hence Facebook's decision to use environmentally controlled trucks to make deliveries to its new data center in Sweden, which is located on the edge of the Arctic Circle. The problem is the temperature change in transporting gear. 'A rapid rate of change (in temperature) can create condensation on the electronics, and that's no good,' said Facebook's Frank Frankovsky."

This isn't anything new; anytime you take something from extreme cold and bring it inside, you risk condensation. This is usually dealt with by simply letting it sit at room temperature for several hours before powering it on.

In the middle of January, if you take a freezing cold delivery, power it on right away, and fry your new (XXXXXX), you deserve to void your warranty. There is no excuse for stupidity. Why is this on Slashdot as news?

That's not the point. Every time you move something from a cold place into a warmer one (higher absolute humidity in the air is implicit, since higher temperature means a higher saturation point), condensation occurs as the air near the cold item cools down and "drops" dew on the cold surface. If the latter is out of sight, like server rails, backplanes or transformer cores/coils, the condensed water will collect there.
Basically, condensation always occurs where a temperature difference exists, and it always happens at the coldest surface in the room. (Hence all the trouble with moisture and poorly insulated walls in colder regions.)
Now a truckload of servers is basically one large thermal buffer. Move it from arctic cold (supposing the machinery had time to adapt to outside temperatures) to room temperature and you will find a lot of water condensing. We're talking about tons of material, with a lot of surface area, that will take hours to warm up.
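The effect described above can be made concrete with a quick dew-point check: dew forms on any surface colder than the dew point of the surrounding air. This is only an illustrative sketch (the Magnus-approximation constants and the function names are my own, not from the article):

```python
# Condensation check: a surface colder than the room air's dew point will
# collect water. Dew point via the Magnus approximation.
import math

MAGNUS_B = 17.62
MAGNUS_C = 243.12  # degrees C

def dew_point_c(air_temp_c, rel_humidity_pct):
    """Approximate dew point of air at the given temperature and RH."""
    gamma = math.log(rel_humidity_pct / 100.0) + (
        MAGNUS_B * air_temp_c / (MAGNUS_C + air_temp_c)
    )
    return MAGNUS_C * gamma / (MAGNUS_B - gamma)

def condenses(surface_temp_c, room_temp_c, room_rh_pct):
    """True if dew will form on a surface at surface_temp_c."""
    return surface_temp_c < dew_point_c(room_temp_c, room_rh_pct)

# A -30 C server rolled into a 20 C room at 40% RH sits far below the
# roughly 6 C dew point, so it sweats until it warms past that point.
print(condenses(-30.0, 20.0, 40.0))  # True
```

The gear doesn't have to reach room temperature before it stops sweating; it only has to climb past that dew point, which is the point a later commenter makes.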

Every time you move something from a cold place into a warmer one (higher humidity in the air implicit, since higher temperature means higher point of saturation)

Actually, that is not implicit. Up here in the frozen wastes of central Alberta, the indoor humidity in winter drops to incredibly low values of 10-20%, because the outside air at -40C holds almost no moisture even when its relative humidity is high. This means that condensation is never really a problem: you might get a bit of it, but it very quickly evaporates because of the incredibly low humidity inside. In fact the humidity gets so low that our data centre has a humidifier to bump it up to the safe operating range of the machines.

Conversely, the UK has no extreme cold (yes, I know the Beeb goes nuts if London drops below -5C, but sorry, that doesn't count!) but lots of humidity. As a kid I had far more problems with my glasses fogging up when I came inside during the winter than I do in Canada.
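The Alberta effect can be estimated directly: heating outdoor air leaves its absolute moisture content unchanged while its saturation pressure soars, so the relative humidity collapses. A rough sketch (Magnus-formula approximation; the function names are mine, not the poster's):

```python
# Why indoor RH crashes in a deep freeze: same moisture, far higher
# saturation pressure once the air is heated to room temperature.
import math

def sat_vapor_pressure_hpa(temp_c):
    """Approximate saturation vapor pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def indoor_rh_pct(outdoor_temp_c, outdoor_rh_pct, indoor_temp_c):
    """RH after heating outdoor air to indoor temperature, moisture fixed."""
    vapor_hpa = sat_vapor_pressure_hpa(outdoor_temp_c) * outdoor_rh_pct / 100.0
    return 100.0 * vapor_hpa / sat_vapor_pressure_hpa(indoor_temp_c)

# -40 C air at 70% RH, heated to 21 C indoors:
print(round(indoor_rh_pct(-40.0, 70.0, 21.0), 1))  # 0.5 (percent!)
```

Real buildings land higher than that (people, cooking and humidifiers add moisture, hence the 10-20% figure), but the direction of the effect is exactly as described.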

Up here in the frozen wastes of central Alberta, the indoor humidity in winter drops to incredibly low values of 10-20%, because the outside air at -40C holds almost no moisture even when its relative humidity is high. This means that condensation is never really a problem

Static electricity, on the other hand... Seriously, ever had a drink from the water cooler when it's that cold outside? Be sure to touch the water with your finger first, or you will discover first hand that the tooth is possibly the worst place on the body to experience a static shock. And if you think your tooth is sensitive, imagine how the electronics feel.

Here in South Dakota, I have a problem with ice forming on my glasses after coming indoors. It doesn't last long, because the indoor air warms the material fairly quickly and the low humidity causes it to evaporate fairly fast.

I get that occasionally, but not often, because of the low humidity. However, when it hit -44C one year (below -50C with wind chill) and I was out walking the dog, I did notice my glasses start to make strange sounds, and I was a little worried that the metal frames were contracting and putting strain on the glass lenses!

Oh, I thought of another one. The problem has nothing to do with temperature as such. The problem is when a surface temperature crosses the local dew point, which happens all the time, not just when it's cold.

One excruciatingly humid summer day I was hauling around a protocol analyzer worth about as much as my car, and it cold-soaked in front of the car's air conditioner duct, cooling itself to 40 degrees or whatever the AC output is. Then it was dripping condensed water as I carried it into the customer premises, an un-airconditioned factory floor. So I'm sitting there doing nothing, explaining to the customer how I have to do nothing until the test set dries off because it's too cold (customer VP looks out the window at a blue-sky 110 degree day). Yes, that was an unpleasant meeting.

Have you ever seen a glass of ice water that showed condensation? Bingo, you should be able to figure out why powering on freezing cold electronics is a bad idea. I'm pretty sure everyone working at a data center is familiar with the idea of condensation, certainly anyone working in any situation where it would be an actual issue.

I can certainly understand that people from warmer climates won't understand how to drive during an ice storm or how to recover from a skid. These are things that come from experience with exposure to a certain climate.

I will answer your question, as I wasn't trolling. I think everyone should consider this obvious because condensation is elementary physics. Considering that I am on a technology site with a notable science influence, it's the kind of thing I just expect people to know.

Ok, my question was badly worded. Everyone knows about condensation. But then why would you expect most of us to think about it before powering a machine? I mean, unless we've tripped on that one once or twice?

Logic.

- It's common knowledge that powering on wet electronics is a great way to let out the magic smoke, right? Right.

Therefore, knowing that electronics brought from a cold environment into a warm one are going to form condensation, and that moisture is bad for powered electronics, the logical conclusion is that plugging them in whilst still covered in condensation is a bad idea.

Not everyone lives in that type of climate. I haven't RTFA, but I assume there are some more challenges that people don't usually think about. Seems like it would be interesting if you're into data centers.

Well, it's not a particularly interesting article, I'm afraid. The interesting part is that they mix incoming fresh air with already-circulated heated air instead of having an isolated heat-exchanger arrangement. Other than that it seems like a fairly traditional datacenter; no SSDs dipped in epoxy sitting outside in the ice or crazy stuff like that.

"Woot, heated trucks"... well, duh, not everything likes to be frozen. Ever had partially frozen milk at school lunch? It sucks. And we walked to school, both ways. Or bicycled (on ice). Or used kicksleds (if there wasn't sand on the route).

Living slightly to the east, yet just as cold in winter: the strategy is to leave the gear sealed in the box while you prep the racks and wiring and gather tools. It's really not that complicated. You don't have to wait until the gear reaches room temperature, merely until it gets above the interior air's dew point, which I assure you is very low in the winter.

It never works that well. I've seen the logs/graphs. Dew point skirts the bottom of the minimum acceptable range in winter (around 40 F, from memory), and skirts the top of the maximum acceptable range in summer (around 60 F, from memory). I suppose it depends on your center. I'm thinking specifically of a private couple-acre financial services DC in the upper midwest, although the telecom data centers I've been in are about the same.
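The band check behind logs like those can be sketched in a few lines. The roughly 40-60 F limits come from the comment above; the function and the alarm messages are invented for illustration:

```python
# Flag dew-point readings that drift outside an acceptable window.
DEW_POINT_MIN_F = 40.0
DEW_POINT_MAX_F = 60.0

def dew_point_alarm(dew_point_f):
    """Return an alarm string, or None if the reading is in band."""
    if dew_point_f < DEW_POINT_MIN_F:
        return "LOW: dry air, rising static risk"
    if dew_point_f > DEW_POINT_MAX_F:
        return "HIGH: cold surfaces and incoming gear may sweat"
    return None

# A winter reading skirting the bottom of the band:
print(dew_point_alarm(38.5))
```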

Even if you do have the ability to artificially raise or lower the humidity that doesn't mean it makes sense to keep it at the same level all year round.

Most electronics are specified for quite a wide range of relative humidity, usually 5% to 95% or so.

In winter you want low relative humidity to reduce the risk of condensation on stuff brought in from outside (yes, you try to seal stuff and let it warm up before unwrapping, but mistakes and emergencies happen). It's also cheap to achieve low relative humidity, thanks to the low outside air temperature (for a given absolute humidity, relative humidity goes down as temperature goes up).

In summer, humidity doesn't matter so much, since stuff brought in from outside will be warm. It's also likely to be more expensive to achieve low relative humidity, since it involves active dehumidification (which is achieved by cooling the air to the point where the water condenses out).

Low RH is bad because you get static buildup. Sure, we've got anti-static wax on the floor and all the cages are grounded, but I still don't want to risk frying a computer because I couldn't keep RH in the right range. Also, low RH is easy to achieve, since CRAC units do it by their nature =) Even in the middle of summer you have to run humidifiers.

Just because the equipment says 5%-95% doesn't mean it's a good idea. That same equipment is probably also rated to 50C+ temperatures. We keep all our facilities around 40% RH. Too high promotes condensation and too low promotes static discharge. All decent HVAC units can control humidity by condensing or evaporating moisture.

So you can get a taste of the luxuries you can afford after a 100 billion dollar IPO. Why wait a few hours before powering up your equipment when you can transfer it in expensive, climate-controlled trucks? At Facebook, even lifeless plastic and metal rides in style on the gravy train.

I'm not sure how much more expensive the climate controlled trucks are. I think it is probably trivial.

I used to run a reefer unit (yes, refrigeration units in trucks will heat too), and we were always picking up coiled steel and transporting it on the reefer to control the climate until it reached its destination. The difference between our runs and flatbeds making the same runs was less than 5 cents per mile to cover the diesel for the reefer units.

RUNNING machines in plastic bags? I'm not sure you understood the problem domain.

You don't understand the solution.

Put the equipment into plastic bags before loading it on the truck (if it's new equipment, it's probably already wrapped in plastic).

Then when you unload it in the warm datacenter, the moisture condenses on the outside of the bag instead of inside your server.

Once the server is up to room temperature, take it out of the bag, rack it and plug it in, and you're good to go. (Note that in warm humid states, you can have the opposite problem - the cold server is taken from the 65 degree datacenter out to the 95 degree and humid outside air and moisture condenses on it. The plastic bag works here too.)

You don't need to go to Sweden to experience cold temperatures; many datacenters throughout the USA see temperatures cold enough to cause condensation problems for at least part of the year. The plastic also helps protect equipment that's exposed to moisture that condenses in clouds and falls to the ground (i.e. rain) as it's transferred from the truck to the facility. That's a problem Facebook will discover once they open their first datacenter in a rainforest; perhaps they can invent some self-deploying canopy that shields the equipment from this mysterious moisture from the sky, since they don't seem to like the low-tech plastic bag solution.

Yep, we never put the tapes from the delivery truck directly into the tape library during the winter for exactly this reason. The tapes get loaded on the truck early in the morning when it's often in the teens outside, taking them from that to the 80 degree datacenter is bad enough, putting them in the 100 degree environment of the tape library without acclimating them would be foolish.

This is usually dealt with by simply letting something sit at room temperature for several hours before powering it on... There is no excuse for stupidity. Why is this on slashdot as news?

Maybe because what you said is NOT the same as what Facebook is doing? If simply letting the servers warm up gradually at the destination works fine, why are they spending extra money on heated delivery trucks?

In the Marine Corps, I had some cold weather training before a deployment to Norway. We were instructed to leave our rifles outside of our tents. Otherwise, they would accumulate condensation inside the barrels, which would then freeze when you walk outside. Lots of fun stuff like that.

This isn't anything new; anytime you take something from extreme cold and bring it inside, you risk condensation. This is usually dealt with by simply letting it sit at room temperature for several hours before powering it on.

True. But what you're forgetting (generously assuming you knew it in the first place) is that condensation isn't the only issue. Servers are made of a variety of materials, all of which expand and contract with temperature at different rates. Extreme cold can actually physically damage the hardware.

Very true. A Dell repair tech once kept my laptop (unauthorized) in his car over the weekend during a winter cold snap. The screen froze and shattered. I ended up with a new laptop from Dell when all was said and done.

Early in my career I worked for Polaris and used to arrange deliveries of computers to places in the Arctic circle. We took a number of precautions to keep the equipment from getting destroyed by the extreme cold, and I was shipping computers to places like Nome, Alaska. We never shipped anything in a heated truck or container, though.

Nome isn't in the Arctic circle. Call us when you go someplace interesting, like riding a Polaris from Fairbanks to Anaktuvuk Pass with a server strapped on the back (no, I haven't. Alaska Air has multiple destinations above the arctic circle, and I've been to all of them, on a plane; I try not to go outside when it's cold).

My favorite was discussing wind chill with regards to electronic equipment, or finding anything rated to work below -40, as most goes to -40, so anything rated for colder requires

We sent equipment to any place that had a Polaris dealership. Nome was a particular dealer that came to mind, as I had to deal with them more than once. We certainly had dealerships in the Arctic circle, as we were the effective equivalent of the local car dealership up there (and in Canada).

I'm not disputing that extreme cold is extremely hard on the equipment. But Facebook isn't dealing with oilfield-type conditions; they are shipping equipment down a highway to a heated data center, which is very different from the conditions in the oilfield.

You're not looking very hard. I'll start with a little town called Barrow [google.com], Alaska. I think we can both agree that it is in the Arctic Circle. They also have a Polaris dealership.

Eskimos Inc Polaris, PO Box 1273, Barrow, AK 99723, 907-852-8000

If you really want, you can look things up directly on Polaris's website [polaris.com]. As I said, I worked there; I dealt with the dealerships for a couple of years. They also have dealerships in the arctic circle in Canada. They have dealerships that operate under everything from Harley Davidson

They didn't show up in a Google search, and on the Polaris website, searching for Barrow gets a result in ME, while Barrow, AK gets no results. And yes, I didn't look too hard; I gave it a little look, didn't see anything popping up on Google, and did go to Polaris's website, but the required Flash version didn't match what I have installed. The link you gave went to a page that embedded a Google map frame, which worked better. Closest I've been to the arctic circle in Canada is

Their website is terrible; it always has been. Many dealerships often won't even have "Polaris" in their name, which throws things off even more. Small motor dealers often carry multiple lines, and it's just the same for Honda or the other manufacturers.

I just happened to recall the northern Alaska / Canadian dealerships in particular because of all the stuff I had to go through compared to a typical place. For many of those dealerships at the time, it was literally the first time they had ever seen a computer. Needless to say...

I replaced a failed tape drive in a warehouse in Alaska, and I pointed out to the shop guy why it failed. There was over one inch of dust collected in the bottom of the case. The silt-like dust/dirt would get into anything and everything.

I'll never "get" the Harley. It's slower and more expensive than almost everything else out there, but then, from what I can tell, most of the people who buy Corvettes are old fat white guys who never even hit the speed limit.

This isn't anything new; anytime you take something from extreme cold and bring it inside, you risk condensation. This is usually dealt with by simply letting it sit at room temperature for several hours before powering it on.

In the middle of January, if you take a freezing cold delivery, power it on right away, and fry your new (XXXXXX), you deserve to void your warranty. There is no excuse for stupidity. Why is this on Slashdot as news?

Not comparable. Bicycles and cars are custom-made for the Swedish market (i.e. you have to custom-make a lot of parts and replace all nuts and bolts), because the humidity fluctuations (and to a lesser degree temperature fluctuations) make cars and bicycles built for "normal" markets corrode really fast. E.g. a bicycle made for any other European market would only survive a few months' use in Sweden; it would corrode fastest during spring and autumn, or mild winters, in southern Sweden, when it is not ex

Also, most electronic circuits freeze to death within a few hours at -5C, and Luleå stays below that temperature most of the year. Any remotely advanced electronic circuit is dead by -20 to -30C unless isolated from the cold, and those are common temperatures during winter in northern Sweden.

Fairbanks, AK here. We get plenty of -30, and we find that most electronics survive it just fine. Vehicles are plugged in more to keep the lubricants and coolant within their operating range, and to keep the battery a bit warmer so it can start things. Even servers can get that cold when not operating; just let them get up to operating temperature before plugging them in.

This isn't anything new; anytime you take something from the extreme cold and bring it inside, you risk condensation.

Having worked at a commercial ISP in the Arctic [wired.com] for 3 years, I can tell you that it's a little different from Minnesota. A few hours isn't nearly enough time to let them sit. Our standard was 24-48 hours in the room the equipment was going to occupy before we'd attach cables and power on. Radical heating and cooling also meant that we'd re-seat RAM, NICs and other cards before booting as a matter of course.

And it's not just cold up there, it's also perfectly arid. Things get dry in southern Canada and the n

I'm from South Dakota, but just got back last month from a year's internment in California - you know, where most tech companies and many data centers are. It's got a similar climate to Texas, supposedly, where there is also a lot of tech.:P

It gets down to -20F quite normally out here during the winter. With temperatures that low, there is very little moisture in the indoor air either, so there isn't going to be all that much condensation. The best thing to do is leave the server in the box for several hours.

This is nothing. Years ago I deployed PCs at Alaskan oilfield installations. Extreme cold makes everything brittle; I kept having problems with things like cracked motherboards, just from setting the PC on a desk.

Cold drive bearings don't want to spin up, and you get SMART failures from drive motor overcurrent. Happens to cooling fans too: the fan can't spin, so the equipment overheats. I've never knowingly had a voice coil bearing seize up, which is interesting, because it's probably the lowest-power actuator in the system yet probably the highest-precision / smallest-tolerance part.

Circuits must be specifically designed and qualified for low temperature operation.
Common low-cost ceramic capacitor dielectrics (Z5U) are rated only to +15C and are useless by 0C. Y5P/Y5V are rated to -30C. X5R / X7R will get you to -55C.
Aluminum electrolytics are useless at low temperature; tantalum is required.
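The dielectric ratings above collapse into a quick lookup. This sketch uses the figures exactly as stated in the comment (the dictionary and function names are mine, for illustration only):

```python
# Minimum rated operating temperature in C for common ceramic capacitor
# dielectric classes, per the figures quoted above.
MIN_RATED_TEMP_C = {
    "Z5U": 15,    # useless by 0 C
    "Y5P": -30,
    "Y5V": -30,
    "X5R": -55,
    "X7R": -55,
}

def rated_at(dielectric, temp_c):
    """True if the dielectric's rated range extends down to temp_c."""
    return temp_c >= MIN_RATED_TEMP_C[dielectric]

print(rated_at("Z5U", -20.0), rated_at("X7R", -40.0))  # False True
```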

Been pretty shitty weather all summer here. Oh, and I remember vividly coming home from a family trip in the winter as kids: we couldn't play the new games we had bought until the next morning, since the 8 MHz bugger wouldn't boot until the "computer room" (a badly insulated porch thingy) heated up. And many, many times we were playing games with our winter jackets on. Maybe our parents were trying to discourage us from being such nerds, but they failed.

Why would you need an "environmentally controlled truck"? What about just using some basic insulation? Shipping in cardboard boxes would slow the temperature change near the electronics enough to prevent condensation.

Basic insulation like cardboard isn't going to cover it when you're looking at a 100F temperature difference between your server room and the outside, and you're possibly looking at a week's worth of shipping time.
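To put a rough number on that: a boxed server approaches ambient temperature exponentially, so insulation only stretches the time constant, it doesn't stop the slide. A lumped-capacitance sketch (the 6-hour time constant is a made-up figure for a boxed server, not a measured value):

```python
# T(t) = T_amb + (T0 - T_amb) * exp(-t / tau): exponential approach to
# the trailer's ambient temperature.
import math

def temp_after_c(hours, start_c, ambient_c, tau_hours):
    """Temperature after `hours` of exponential approach to ambient."""
    return ambient_c + (start_c - ambient_c) * math.exp(-hours / tau_hours)

# A 20 C server in a -30 C trailer: after a 48-hour haul, even generous
# insulation has long since given up.
print(round(temp_after_c(48.0, 20.0, -30.0, 6.0), 1))  # -30.0
```

Over a multi-day shipment the cargo ends up at outside temperature regardless, which is why an actively heated trailer is the only way to keep it warm the whole way.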

A 'temperature controlled' truck doesn't even have to turn on said features until like 24-48 hours before delivery, but it's still useful.

I unloaded trucks in the wintertime in Nebraska. Nowhere near the arctic circle, but the cold seeped into said vehicles. Even with a hot air blower in the sto

Back in the day, I had to go to my data center when it was around 100 degrees outside, so I was of course in shorts, t-shirt and sandals. I was there for 18 hours. The temperature inside was like 50 degrees. Yeah, that doesn't seem cold, but after 18 hours I felt like I had hypothermia.

A few years back, while doing Tier 2 support for a major Canadian telco, I started seeing overheating alarms from some Nokia DSLAMs. The odd thing was that it was -40C outside at the time. It turns out the fans on these DSLAMs had frozen solid, and the devices thought they were overheating, throwing alarms left, right and centre.
We had to put a tarp over them with a heater during the winter to make sure they kept going.

Luleå has to have one of the most extreme temperature ranges anywhere. The summer temperature is quite consistently 15-20C with occasional peaks of 30C, and the winter temperature runs from zero to 40C below. So the range is nearly 90C (130F)! This is of course seasonal variation and not "rapid change", so data centers should not be affected by it. The fastest changes there probably come in winter, when the temperature can in rare cases go from -40 (and zero humidity) to zero (and damp) in a day or two. That kind of change, especially the other way round, could mean trouble (condensation in air inlets/outlets, etc.)

In fact, if Facebook just wanted a cold/dry climate, there have to be better locations. Northern Sweden is mild and has quite warm summers; an arctic inland climate further from the Gulf Stream and the Atlantic would be more logical, such as the border between Russia and Finland. But there are probably logistical reasons (huge cargo airport, good port, good roads, railroads, lots of good technical people, ridiculous backbone connection) that placed the datacenter there.

Slow down, cowboy! According to Wikipedia, Fairbanks averages +22C for highs in July and -28C for lows in January; the record temperatures in Fairbanks are +37C high and -54C low, but these records were not set in the same year, so I doubt that Fairbanks gets "+40C to -60C over a year".
An even harsher climate, BTW, is found in Yakutsk: the average is +25C for highs in July and -41C for lows in January. The records are even more impressive: +38C high, -64C low.

Naah, now you're pushing it. -30 is the usual minimum for the whole of Lapland, maybe -35 when it's really cold. But yeah, the summers are hot; you can swim outside in the rivers if you really want to. As for the reasons for the location of the datacenter, you're forgetting Luleå University of Technology.

Yeah, typo. I confused myself doing both C and F in my head. And of course inland climates have more extremes, like the central US and a lot of Russia. Very northerly inland locations should be more consistently cold (that is, cool or cold depending on the season).

Then it would have been well known in that area already. I lived in that area earlier, and the issue of low temperatures and condensation was not one of the major concerns.

What people tend to forget is that when the outdoor temperature goes down, the relative humidity indoors also drops considerably, which means the condensation issues aren't that big. And the most sensitive parts are the hard disks; just wait to unpack them from the ESD bag until they are up to room temperature. Same
