
Thanks to a poor economy, the more aggressive use of virtualization, and the rise of more efficient iron, the world's data centers are consuming less energy than expected, according to a new report from one prominent Stanford University researcher.

In early 2007, Jonathan Koomey – a staff scientist at the Lawrence Berkeley National Laboratory and a professor at Stanford University – created a model of the world's server installed base and electricity usage for these servers. Based on data from a report to the US Congress from the Environmental Protection Agency and IT market watcher IDC, this original Koomey report showed that power consumption of installed servers in both the United States and across the world was growing faster than the installed base of machines.

It also indicated that data centers would soon start slurping big mouthfuls of juice out of the global electric grids.

Nearly a year later, chip maker Advanced Micro Devices commissioned Koomey to do a follow-up study based on finer-grained data from IDC. That data and the prognostications for 2010 that Koomey put together at the time were done just as the United States slipped into the Great Recession, eventually pulling down most of the rest of the world with it.

That second report projected that power consumption in the world's data centers would double between 2005 and 2010, repeating the doubling seen from 2000 to 2005. But while the Great Recession was awful for most of us, it has apparently been great for data center efficiency, according to Koomey's latest report, commissioned by the New York Times.

Server growth shrinkage

A number of different factors combined to lower the electricity bill at glass houses and data closets, but perhaps the largest factor was that the worldwide installed base of servers did not grow as fast as IDC had been projecting. IDC reckoned that the United States had 5.61 million servers in 2000 and 10.3 million in 2005, with projections for the base to grow to 15.8 million by 2010.

In the third report from Koomey, which uses more recent IDC data, we discover that the US installed base was a tiny bit lower in 2005 than first thought, and that through the end of 2010 it was a lot lower than expected: just under 11.9 million units, or nearly 4 million servers fewer than anticipated based on past trends.

According to the latest IDC data, the server base did not quite double from 2000 to 2005, and it didn't grow as fast as anticipated either. IDC said that there were 14.1 million servers in the world in 2000, that they grew to 27.3 million by 2005, and that they would hit 41.2 million by 2010. As it turns out, the world had 2.2 million fewer servers in 2005 than IDC thought and the global base at the end of 2010 stood at 32.7 million, nearly 8.5 million fewer machines than expected.
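The shortfalls quoted above follow directly from the IDC figures; as a quick check (a sketch using only the numbers reported in this article, in millions of servers):

```python
# Installed-base figures cited above, in millions of servers (as reported by IDC)
us_projected, us_actual = 15.8, 11.9        # US, 2010: projected vs actual
world_projected, world_actual = 41.2, 32.7  # worldwide, 2010: projected vs actual

# Shortfalls versus the pre-recession projections
print(f"US: {us_projected - us_actual:.1f} million fewer servers")      # nearly 4 million
print(f"World: {world_projected - world_actual:.1f} million fewer")     # about 8.5 million
```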

When you add up all the juice across the servers as well as related storage, networking gear, and data center infrastructure for power distribution and cooling, data centers in the United States burned 28.2 billion kilowatt-hours in 2000, or about 0.82 per cent of total electricity generated, doubling to 56 billion kilowatt-hours in 2005, about 1.53 per cent of all juice produced.

In early 2007, based on then-current historical data for the US, Koomey projected that 135.1 billion kilowatt-hours would go up in coal smoke powering IT gear in 2010 if nothing changed. But factoring in advances in server virtualization, power efficiency inside servers and other gear, and data center design, Koomey guessed that US consumption would actually rise only to 107.9 billion kilowatt-hours in 2010.

Power range

Based on the most recent data from IDC on server installed base and a range of power efficiency for the gear, Koomey now says that it looks like the data centers and closets in the US only burned somewhere between 67.1 and 85.6 billion kilowatt-hours, which works out to between 1.73 and 2.2 per cent of total electricity produced in the country last year. That is far better than the historical trend estimate of 3.78 per cent and significantly better than the best-case scenario from 2007, which had IT gear and their fancy housing burning up 2.78 per cent of all electricity in the country.
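As a sanity check on those figures (our arithmetic, not Koomey's), both ends of the range should imply roughly the same total US electricity generation; a quick back-of-the-envelope calculation using only the numbers above:

```python
# Both ends of Koomey's 2010 US range, in billions of kWh, alongside the
# corresponding shares of total US electricity production cited above
low_kwh, high_kwh = 67.1, 85.6
low_share, high_share = 1.73, 2.2   # per cent

# Implied total US generation: energy divided by its share of the total
total_low = low_kwh / (low_share / 100.0)     # ~3,879 billion kWh
total_high = high_kwh / (high_share / 100.0)  # ~3,891 billion kWh
print(f"Implied US generation: {total_low:,.0f} to {total_high:,.0f} billion kWh")
```

Both ends land within half a per cent of each other, at roughly 3.9 trillion kilowatt-hours, so the range is internally consistent.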

Worldwide IT power consumption for 2000, 2005, and 2010

Worldwide, the numbers are considerably better too. Koomey estimated that in 2000, all the IT gear and their crunchy data center wrappings consumed 70.8 billion kilowatt-hours of electricity, more than doubling to 152.5 billion kilowatt-hours by 2005. The historical trend in IT gear installed base growth and machine power consumption would have had the world eat 398.8 billion kilowatt-hours in 2010 if nothing changed, and Koomey's 2007 estimates were for efficiency efforts to kick in and help push that down to 301.1 billion kilowatt-hours, or about 1.66 per cent of the 18.1 trillion kilowatt-hours of juice that would be produced.

As it turns out, Koomey now says that all of the data centers and data closets worldwide consumed somewhere between 203.4 and 271.8 billion kilowatt-hours of electricity, or somewhere between 1.12 and 1.5 per cent of total electricity. When contacted by El Reg about making projections for the power consumption for data centers for 2015, Koomey cited a quote by physicist Niels Bohr: "Predictions are very difficult, especially about the future."
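The same arithmetic works on the worldwide figures: both ends of that range imply nearly the same total world generation, about 18.1 trillion kilowatt-hours. Again, this is a consistency check on the numbers quoted above, not a figure pulled from the report:

```python
# Worldwide 2010 range in billions of kWh, with the shares cited above
low_kwh, high_kwh = 203.4, 271.8
low_share, high_share = 1.12, 1.5   # per cent

# Implied total world generation from each end of the range
print(f"{low_kwh / (low_share / 100.0):,.0f} billion kWh")   # ~18,161
print(f"{high_kwh / (high_share / 100.0):,.0f} billion kWh") # ~18,120
```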

Having taken a stab at projecting electricity consumption before, and having failed to see the recession and its effects coming, Koomey is not eager to try again.

"There is certainly a lot of room to drive more efficiency," Koomey wrote in an email to El Reg. "The data center report to Congress shows that in theory there are enough options to reduce electricity use from current levels, but I'm not in the predictions biz." ®