Researchers at MIT have taken light-emitting diodes' vaunted efficiency to new heights: they've shown that an LED can emit more light power than it consumes in electrical power.

A team led by Parthiban Santhanam found that an LED powered by 30 picowatts of electricity emitted 70 picowatts of light power - an efficiency of over 230 percent - according to a synopsis by the American Physical Society.
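The efficiency figure follows directly from the two power numbers in the synopsis. A minimal sketch of the arithmetic (the variable names are illustrative, not from the paper):

```python
# Wall-plug efficiency: optical power out divided by electrical power in.
# The two values are the ones reported in the APS synopsis.
power_in_pw = 30    # picowatts of electricity consumed
power_out_pw = 70   # picowatts of light emitted

efficiency = power_out_pw / power_in_pw
print(f"{efficiency:.0%}")  # roughly 233 percent, i.e. "over 230 percent"
```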

The same sort of result does not hold true at higher wattages that would be of any use for, say, reading a book or lighting a room. No matter how efficiently the LED operates, 70 picowatts - that's 70 trillionths of a watt - won't illuminate many dark corners or pages of War and Peace. Although today's efficient 20-watt LED bulbs shine as brightly as a 100-watt incandescent, we'd have a long way to go before 70 picowatts would serve any functional purpose.

The MIT results almost seem to conjure up power. Photo from Chris Dlugosz via Flickr.

But that's not the point. What's important is that the MIT team took a new approach to trying to improve LED efficiency. In the conventional quest, scientists try to increase the number of photons emitted per electron that travels across the LED, which is a semiconductor.

Santhanam and his colleagues instead decreased the voltage, and found that a reduction in voltage lowered the electrical input power faster than it lowered the outgoing light power (remember, "voltage" is not a measure of power, but of electric potential). They used a particular LED with high electrical conductivity.

"With each halving of the voltage, they reduced the electrical power by a factor of 4, even though the number of electrons, and thus the light power emitted, dropped by only a factor of 2," according to the synopsis.
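The arithmetic behind that quote can be sketched in a few lines, assuming a roughly ohmic device in which current - and with it the number of electrons and the emitted light power - scales in proportion to voltage (an illustrative simplification, not the paper's actual device model):

```python
# Electrical input power is P = I * V. If current I scales with voltage V
# (ohmic assumption), halving V halves I, so P drops by a factor of 4,
# while the light power, proportional to I alone, drops by only 2.

def electrical_power(voltage, conductance):
    current = conductance * voltage  # I proportional to V (ohmic assumption)
    return current * voltage         # P = I * V

G = 1.0                              # arbitrary conductance
p_full = electrical_power(1.0, G)    # baseline voltage
p_half = electrical_power(0.5, G)    # voltage halved

print(p_full / p_half)               # electrical power: factor of 4
print(1.0 / 0.5)                     # current, and light power: factor of 2
```

So at low enough voltage the light power falls more slowly than the electrical power, which is how the ratio of the two can climb past 100 percent.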

The website Phys.org explains that the process taps excess heat from vibrations in the LED's atomic lattice. That holds potential for designing lights that don't generate heat, it notes, adding, "When used as a heat pump, the device might be useful for solid-state cooling applications, or even power generation."