A Moore's Law for computers and energy efficiency

A research paper finds that the electrical efficiency of computing doubles roughly every year and a half, a trend that made laptops, smartphones, and tablets possible and that opens the door to networks of wireless sensors.

Today's smartphones need to be charged far more often than older cell phones did. But if it weren't for rapid improvements in energy efficiency, smartphones, laptops, and other mobile gadgets might still be on the drawing board.

A paper published in the latest issue of the IEEE's Annals of the History of Computing finds a rough equivalent of Moore's Law for energy and computing. As computing muscle has increased over time, the amount of energy needed per computation has fallen, the paper finds. In fact, improvements in energy efficiency are driven by the same techniques engineers use to make microprocessors more powerful, including cramming more transistors onto a chip, according to researcher Jon Koomey, the paper's lead author.

"The things you do to improve (computing) performance also invariably improve computations per kilowatt-hour," he said. "I do think these trends will keep going for a while, since we are very far from the theoretical limits."

Koomey and his co-authors reviewed the number of computations computers have been able to perform per kilowatt-hour since the 1940s. They found that computers have done steadily more work per unit of energy over time, with the number of computations per kilowatt-hour doubling about every year and a half.
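The scale of that trend is easy to work out: a fixed 1.5-year doubling period compounds to roughly a hundredfold efficiency gain per decade. A minimal sketch of the arithmetic (the 1.5-year doubling period comes from the paper; the function name is ours):

```python
def efficiency_gain(years, doubling_period=1.5):
    """Factor by which computations per kilowatt-hour grow over
    `years`, assuming a fixed doubling period (in years)."""
    return 2 ** (years / doubling_period)

# Doubling every 1.5 years compounds to roughly 100x per decade:
print(round(efficiency_gain(10)))  # → 102
```

In other words, a computation that cost a kilowatt-hour in 2000 would cost about a hundredth of that by 2010, under this idealized trend.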

This trend toward greater energy efficiency helped pave the way for the explosion in mobile computer products, such as laptops, the authors argue. "This development (of laptops replacing desktop computers) would not have been possible without long-term improvements in computational efficiency because battery technologies have not improved in the past nearly as rapidly as semiconductor technologies," they wrote in the paper.

The tandem increase in computing power and energy efficiency means that well-designed computers can keep becoming more energy efficient. More important, Koomey thinks, the trend opens up the possibility of very small computational devices, such as sensors and controls. Sensors on a bridge, for example, could monitor the structure for damage and alert transportation officials when maintenance is required. Or lighting sensors could provide just the amount of light needed based on occupancy and daylight levels.

"I'm most excited by the developments in wireless sensing technology, which will allow us to collect environmental and other data at very fine levels of disaggregation. We think we're buried in data now, just wait until we start using wireless sensor networks on a grand scale," he said.

Even though computing energy efficiency is improving over time, the total amount of energy used by computers of all types is on the rise. In separate research, Koomey found that electricity use by data centers alone, which serve all manner of PCs and mobile devices, grew 36 percent in the U.S. from 2005 to 2010 and 56 percent worldwide.
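Those five-year totals translate into steady compound growth. A quick back-of-the-envelope conversion (the 36 and 56 percent figures are from the research cited above; the helper itself is illustrative):

```python
def annual_growth(total_growth_pct, years=5):
    """Convert total percentage growth over `years` into a
    compound annual growth rate, in percent."""
    return ((1 + total_growth_pct / 100) ** (1 / years) - 1) * 100

print(round(annual_growth(36), 1))  # U.S. data centers: → 6.3 (% per year)
print(round(annual_growth(56), 1))  # worldwide: → 9.3 (% per year)
```

So even as each computation gets cheaper in energy terms, total data-center electricity demand has been compounding at several percent a year.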

As for today's power-hungry smartphones, Koomey says the slow progress of battery technology is a significant factor. "It also is affected greatly by what we expect these devices to do, and by any measure, current smartphones are delivering orders of magnitude more computing services than the old cell phones," he said.