Posted
by
timothy
on Tuesday June 17, 2008 @09:53AM
from the blinky-blue-is-the-new-dull-amber dept.

Hugh Pickens writes "For years, data center designers have toiled in obscurity in the engine rooms of the digital economy, amid the racks of servers and storage devices that power everything from online videos to corporate e-mail systems. But now people with the skills to design, build and run a data center that does not endanger the power grid are suddenly in demand. 'The data center energy problem is growing fast, and it has an economic importance that far outweighs the electricity use,' said Jonathan G. Koomey of Stanford University. 'So that explains why these data center people, who haven't gotten a lot of glory in their careers, are in the spotlight now.' The pace of the data center build-up is the result of the surging use of servers, which in the United States rose to 11.8 million in 2007, from 2.6 million a decade earlier. 'For years and years, the attitude was just buy it, install it and don't worry about it,' says Vernon Turner, an analyst for IDC. 'That led to all sorts of inefficiencies. Now, we're paying for that behavior.'" On a related note, an anonymous reader contributes this link to an interesting look at how a data center gets built.

This is only a problem because the power grid has become very fragile.

Electricity generation hasn't grown ahead of demand due to government meddling, atom-ophobia, and environmentalist obstruction in the courts and on planning boards.

The rolling blackouts will be coming soon. It'll start with small ones. Then everyone will buy battery backups that draw a lot of power to recharge once power is restored. This will cause the duration of the periodic blackouts to go from a few minutes to a few hours in about 2 years.

Not long after that, we'll start building power generation capacity in the US again.

I don't understand the peculiar emphasis the New York Times places on "endangering" the power grid. Even though a data center uses a lot of electricity, it's a high-value operation that needs a stable power supply. What's wrong with the idea of paying more to ensure that your power supply is sufficiently stable for your needs? The power company accepting those checks can then work on delivering that power. It's like saying that I'm somehow responsible for the stability of the oil production and distribution infrastructure because I drive a car. Perhaps, if I tweak my engine just so, I can engineer a democratic transformation of Saudi Arabia. I'll see if changing the oil does the trick.

At some point, you have to realize that the consumer, no matter how big, isn't responsible for the supply of resources by another party. If there's a problem with how those resources are supplied, be it fixed price (regardless of demand) power transmission lines, pollution, or deforestation, then that problem should appear as an increase in cost to the consumer. If it isn't, then it's a problem with how the resource is distributed, not a problem with the consumer.

On the other hand, it might be the final push that people need to start making their homes and businesses as energy efficient as possible, up to and including home solar and/or wind; use of more energy-efficient appliances, low-power-consumption electronics, etc.

I would dare say that the future looks good for ARM and Via on that last account, at least.

There has been a shortage of architectural engineers for the past two decades. I say architectural engineers because very few mechanical engineers go into HVAC, and very few electrical engineers do power systems. It doesn't seem quite as bad structurally.

It is a shame, because it really has a lot of great career opportunities.

Data center work is just a subset of that-- it is hard to find people with the experience, but not impossible to train.

We just thought about doing this for a slightly different reason. Trailers are a funny loophole in US regulations. If you pull a trailer inside a building, the building inspectors come (enviro, electric, whatever) and you just tell them it is a trailer that is currently stored inside. They assume it is regulated by the DMV or some other travel safety org. But if you never actually drive the trailer on the roads, you never incur those regulations/inspections. Seriously, trailers are a sneaky way to avoid some regulations - legally.

Well, between uninterruptible power, and air conditioning, datacentres are probably one of the highest 'power overhead' applications. There's a hell of a lot of 'waste' there, which you can design out, in some measure.

But yes, it's a priority application too - datacentres score as 'business critical' in most companies, so no matter how much it costs to run, it's cheaper than it 'not running'.

Part of the point of DC design is resiliency, and therefore you _do_ have to consider available services and supplies - like the local powergrid, and how screwed you'll be if it does hit the breaking point.
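The "power overhead" described above is commonly quantified as Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. A minimal sketch of the arithmetic, with made-up figures for illustration (none of these numbers come from the article):

```python
# PUE = total facility power / IT equipment power.
# A PUE of 2.0 means every watt of compute costs a second watt of
# overhead (UPS conversion losses, cooling, lighting, etc.).
# All figures below are illustrative assumptions, not measurements.

def pue(it_kw: float, cooling_kw: float, ups_loss_kw: float, other_kw: float) -> float:
    """Return PUE given IT load and the major overhead loads, in kW."""
    total_kw = it_kw + cooling_kw + ups_loss_kw + other_kw
    return total_kw / it_kw

# Hypothetical 500 kW IT load with heavy cooling and UPS overhead:
print(round(pue(500.0, 350.0, 75.0, 25.0), 2))  # 1.9
```

Designing out that "waste" means driving the cooling and UPS terms down, which is exactly where the chilled-water, free-cooling, and power-distribution choices discussed in this thread come in.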

On the "how a data center gets built" front, last week I had a tour of a new $250 million data center facility in Virginia that is getting ready to open later this month. The facility manager provided a walk-through of the power and cooling infrastructure, explaining the company's approach to designing these systems for energy efficiency and scale. I shot video, which is now posted online [datacenterknowledge.com]. The data center operator, Terremark, separated most of the electrical infrastructure from the IT equipment, putting them on separate floors and housing the generators in a separate facility. They have 11 generators now, but will have 55 Caterpillar 2.25-megawatt units when the entire complex is finished.

All companies face those same challenges, including Microsoft and Google. I work mostly with banks, and we are always faced with micro and macro change and growth planning. With help, some banks can go from a one-quarter projection to a reasonable three year projection. Five to six years is harder, but sometimes possible.

The biggest secret is in providing enough space to allow for growth, changing needs, and eventually equipment replacement.

As for efficiency, I have to tell an aspiring co-lo that they will pay more for power than for their OC-192s, and that the cost of a server is less than the power it consumes. It is easy for growing companies to ignore it at first, but it eventually catches up with you.
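The "power costs more than the server" claim is easy to sanity-check with back-of-the-envelope arithmetic. A sketch, where every input (wattage, PUE, electricity rate, lifetime) is an illustrative assumption rather than a quoted figure:

```python
# Rough lifetime electricity cost of one server, including facility
# overhead. Every number used below is an illustrative assumption.

def lifetime_power_cost(draw_watts: float, pue: float,
                        usd_per_kwh: float, years: float) -> float:
    """Total electricity cost: server draw scaled by facility PUE,
    run 24/7 for the given number of years."""
    hours = years * 365 * 24
    kwh = (draw_watts / 1000.0) * pue * hours
    return kwh * usd_per_kwh

# Hypothetical 1U server: 400 W draw, facility PUE of 2.0,
# $0.10/kWh, four-year service life:
print(round(lifetime_power_cost(400, 2.0, 0.10, 4)))  # 2803
```

Under those assumptions the power bill is on the order of the purchase price of a commodity server, which is why it catches up with growing companies that ignore it.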

The old solution was to move the servers to a place with cheap electricity. That will backfire soon; you really need to shift focus to plan for energy efficiency, even if it means your fiber runs are longer (segregate by density rather than system or function).

While it may appear that you don't have to work hard to cool the data centers, you will have to work hard to humidify them if you do not want your equipment to die. This is a non-trivial cost and is the reason that "free cooling" (taking in outside air to cool a data center) is often not free. One answer may be heat wheels, but they are fairly new and unproven in the data center space. Take a look at http://www.kyotocooling.com/ [kyotocooling.com]

Oh yes! Carpet in a server room. I wouldn't even put "NSA carpet" in there -- it has conductive filaments to ground out any EMI.

I had the same several-month-long arguments in planning our new office. It's expensive. We cannot raise the ceiling (the building HVAC systems are in the plenum.) Do we really need 5-ton air handlers? Do we have to have two of them? Etc., etc. Well, my 12" floor became a 10" floor -- a compromise to make the ramp 2ft shorter, and 2 Lieberts became one because no one listened to my original specs and the landlords wouldn't buy the second one. (those things are expen$ive.)

Btw, that single point of failure failed within *4* months, requiring basically the entire office to be shut down all day to get it fixed. It was over 100F in there in less than an hour.

I've been a low level tech grunt for almost 20 years. Nice thing is, I'll always have work. Am the 21st C. equivalent of a general auto mechanic. Nice thing is, I've been able to make a decent living and pretty much pick my work environment. Have never been afraid to pick up and leave, even with family. What's funny is, there are several younger guys in my shop and they are chafing at the work. They don't like the hands on stuff and are just bitching and moaning, hoping to get a team lead position. Only one guy is doing anything about it, going to school for his MBA. Hopefully, he'll retain some sense of just what can and can't be accomplished via IT and won't become a bone headed manager.