Someone has constructed an LED that puts out 2.3 times as much energy as you put into it...it will be interesting to see if they find any way to scale this technology. It could at least, perhaps be adapted for use as a kind of heat sink.

The LED produces 69 picowatts of light using 30 picowatts of power, giving it an efficiency of 230 percent. That means it operates above "unity efficiency" -- putting it into a category normally occupied by perpetual motion machines.

However, while MIT's diode puts out more than twice as much energy in photons as it's fed in electrons, it doesn't violate the conservation of energy because it appears to draw in heat energy from its surroundings instead. When it gets more than 100 percent electrically efficient, it begins to cool down, stealing energy from its environment to convert into more photons.

In slightly more detail, the researchers chose an LED with a small band gap, and applied smaller and smaller voltages. Every time the voltage was halved, the electrical power was reduced by a factor of four, but the light power emitted only dropped by a factor of two. The extra energy came instead from lattice vibrations.
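That scaling can be sanity-checked numerically. A rough sketch in Python, using only the two figures reported (30 pW in, 69 pW out) and assuming the halving behavior described above continues:

```python
# Article figures: 30 pW electrical in -> 69 pW of light out (230%).
# Reported scaling: halving the voltage cuts electrical power by 4x
# but light output only by 2x, so efficiency doubles with each halving.
p_in = 30e-12   # electrical power, watts
p_out = 69e-12  # optical power, watts

for _ in range(4):
    print(f"in = {p_in:.2e} W, out = {p_out:.2e} W, "
          f"efficiency = {p_out / p_in:.0%}")
    p_in /= 4    # halve the voltage -> quarter the electrical power
    p_out /= 2   # light output only halves
```

This prints efficiencies of 230%, 460%, 920%, 1840% — each halving of the voltage doubles the "wall-plug" efficiency, at the cost of ever-tinier absolute output.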

Wandering in a vast forest at night, I have only a faint light to guide me. A stranger appears and says to me: 'My friend, you should blow out your candle in order to find your way more clearly.' The stranger is a theologian.

At such low power levels, IMO measurement error is a distinct possibility.

Even in the low-power electronics applications alluded to in the article, this thing would need to scale up by multiple orders of magnitude to be useful for anything. We're talking 0.00000000007 watts here, guys!

The years just pass like trains. I wave, but they don't slow down. -- Steven Wilson

Common Sensei wrote:Someone has constructed an LED that puts out 2.3 times as much energy as you put into it...it will be interesting to see if they find any way to scale this technology. It could at least, perhaps be adapted for use as a kind of heat sink.

The LED produces 69 picowatts of light using 30 picowatts of power, giving it an efficiency of 230 percent. That means it operates above "unity efficiency" -- putting it into a category normally occupied by perpetual motion machines.

However, while MIT's diode puts out more than twice as much energy in photons as it's fed in electrons, it doesn't violate the conservation of energy because it appears to draw in heat energy from its surroundings instead. When it gets more than 100 percent electrically efficient, it begins to cool down, stealing energy from its environment to convert into more photons.

In slightly more detail, the researchers chose an LED with a small band gap, and applied smaller and smaller voltages. Every time the voltage was halved, the electrical power was reduced by a factor of four, but the light power emitted only dropped by a factor of two. The extra energy came instead from lattice vibrations.

Okay. Nothing is 230% efficient. I don't care what the article says. Or what they claim. Or why it "doesn't break" conservation laws. It just CANNOT break it. The energy comes from somewhere. Oh wow it "draws it in" from the surroundings... that's energy consumption. That's not 230% efficiency.

Yeah, it's a bit of hyperbole to be sure. But the point is that you (allegedly... I'm still not convinced it's real) get more light out of it than the electricity you put in. It's exactly like how a heat pump can be more efficient than a normal furnace or air conditioning system because it augments its heating or cooling capacity by drawing/pushing heat from/to the outside air.

You guys speak for yourselves. We're building a bridge to cross the Ohio that's going to have 2.3 cars exiting the bridge for every car that enters. This is the 21st century, guys. This ain't your grampa Einstein's universe anymore.

If the tech works as claimed, there's nothing magical (or thermodynamics violating) about it. The "hot" side of a heat pump puts out more heat than the energy required to run the pump. This is essentially the same thing, except the output is light instead of heat.

That said, as I've already noted it's a long way from a laboratory device that is dealing with picowatts to something that has practical applications.

You are arguing over semantics. From the perspective of the intended power source (a battery, for example), it is 230% efficient. For the system as a whole, it is NOT 230% efficient, but the article doesn't say how much heat energy plus electrical energy it takes to produce said light energy.

Regardless, this cooling effect may be useful in microelectronics, as each small transistor may only consume about this much energy. It may be a way to bleed heat off as light, effectively removing it from an enclosure.

Still, you have to admit that even on this tiny scale, this is pretty awesome. And yes, extremely small currents and powers can be reliably measured these days with low error. Just ask my brother-in-law, who routinely measures currents generated by and flowing across single cells.

If it scales up, this sort of phenomenon could be piggybacked on heat sinks to remove heat generated by LED lighting or other devices. Also, some specialized energy dispersive detectors could be self cooling if the emitted light spectrum is not close to energies being measured. Cooling directly in the lattice might lead to lower dark currents than cooling by conduction perhaps?

Mr Bill wrote:If it scales up, this sort of phenomenon could be piggybacked on heat sinks to remove heat generated by LED lighting or other devices. Also, some specialized energy dispersive detectors could be self cooling if the emitted light spectrum is not close to energies being measured. Cooling directly in the lattice might lead to lower dark currents than cooling by conduction perhaps?

That could be the first valid use of case lights ever, in that case.

I do not understand what I do. For what I want to do, I do not do. But what I hate, I do.

Here are another couple of articles on this; the second one, from Ars Technica, is recent.

The point is that this 230% efficiency only exists at extremely small potentials. You'd need like a trillion of these LEDs to produce any usable light output - not practical as a light source at this level of input voltage. Unless these LEDs can be made into an array at least as cheaply and on a scale the size of current transistors, I don't see how they'd be that useful as light source or heat engine.

cynan wrote:The point is that this 230% efficiency only exists at extremely small potentials. You'd need like a trillion of these LEDs to produce any usable light output - not practical as a light source at this level of input voltage. Unless these LEDs can be made into an array at least as cheaply and on a scale the size of current transistors, I don't see how they'd be that useful as light source or heat engine.

For specific use cases, this could be very useful. Like as a cell phone display light. Not only does it cool down the SoC, it lights the display efficiently.

The article states that when the power is reduced by a factor of four, the light output is only halved. This means that when you increase the power by a factor of four, the light output only doubles.

Given it is "230% efficient" (which in itself is a BS play on semantics), let's say I wanted, oh, a watt of light. Just one watt. If quadrupling the power to double the light holds true, we're talking gigawatts of input power here.
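A quick back-of-envelope check, taking the article's 30 pW → 69 pW operating point and assuming light ∝ √(electrical power) held all the way up (it surely wouldn't, but as an illustration):

```python
# Article operating point: 30 pW electrical -> 69 pW optical.
# Assumed scaling from the article: optical power ~ sqrt(electrical power),
# i.e. electrical input grows as the SQUARE of the light output.
p0_in = 30e-12    # W electrical
p0_out = 69e-12   # W optical

scale = 1.0 / p0_out          # factor needed to reach 1 W of light (~1.4e10)
p_in = p0_in * scale ** 2     # input grows as the square of that factor
print(f"Electrical input for 1 W of light: {p_in:.1e} W")  # ~6.3e9 W
```

Roughly six gigawatts of input for one watt of light, if you insisted on scaling a single diode this way.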

yogibbear wrote:Okay. Nothing is 230% efficient. I don't care what the article says. Or what they claim. Or why it "doesn't break" conservation laws. It just CANNOT break it. The energy comes from somewhere. Oh wow it "draws it in" from the surroundings... that's energy consumption. That's not 230% efficiency.

Actually, any engineering student who's taken an undergrad thermodynamics class should be familiar with heat pumps and air-conditioning systems, which are greater than 100% efficient in this sense. Note, however, that the second law is not violated, as the heat output has higher entropy than the original system.

In both these cases, the higher than unitary efficiency comes from drawing on extra energy gradients which are "free", hence you get more output (heat, light) than your input energy (heat, electricity). Energy is still conserved AND you get more than 100% efficiency.
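The heat-pump comparison can be made concrete with toy numbers (illustrative only, not from the article):

```python
# Heat-pump analogy: "efficiency" above 100% is really a coefficient of
# performance (COP) = useful heat delivered / electrical work input.
# The "free" energy comes from the outdoor air, just as the LED draws
# heat from its own lattice. Numbers below are purely illustrative.
work_in = 1.0                                  # kW of electrical input
heat_from_outside = 3.0                        # kW pumped in from outdoor air
heat_delivered = work_in + heat_from_outside   # energy is still conserved
cop = heat_delivered / work_in
print(f"COP = {cop:.1f} (i.e. '{cop:.0%} efficient' in the article's loose sense)")
```

Energy in (1 kW electricity + 3 kW ambient heat) equals energy out (4 kW delivered), yet the ratio of useful output to paid-for input is 4:1.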

flip-mode wrote:You guys speak for yourselves. We're building a bridge to cross the Ohio that's going to have 2.3 cars exiting the bridge for every car that enters. This is the 21st century, guys. This ain't your grampa Einstein's universe anymore.

The article states that when the power is reduced by a factor of four, the light output is only halved. This means that when you increase the power by a factor of four, the light output only doubles.

Given it is "230% efficient" (which in itself is a BS play on semantics), let's say I wanted, oh, a watt of light. Just one watt. If quadrupling the power to double the light holds true, we're talking gigawatts of input power here.

Seems to me the concept is that you run each one at the almost vanishingly low power level where it works very efficiently, but make up for the low output by having millions of microscopic versions of this thing etched onto a single chip. Since light output will scale linearly with number of LEDs, this theoretically could get you useful amounts of light at very high efficiency.
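Some rough numbers for that idea, using the article's 69 pW per-LED figure; the watts-per-lumen conversion is an assumption based on green light near the eye's peak sensitivity (~683 lm/W):

```python
# How many of these LEDs, each emitting ~69 pW of light at its
# peak-efficiency operating point, would a dim 1-lumen light need?
per_led = 69e-12    # W of light per LED (article figure)
one_lumen = 1.5e-3  # W optical for ~1 lumen, assuming green light (~683 lm/W)
n = one_lumen / per_led
print(f"LEDs needed for ~1 lumen: {n:.1e}")  # ~2.2e+07
```

Tens of millions of emitters per lumen sounds absurd, but it's in the same ballpark as the pixel counts already etched onto display panels, so the array idea isn't obviously crazy.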

Such a device would be the equivalent of a Peltier device where the "hot" side emits visible radiation instead of infrared (heat).

Another amusing implication of this phenomenon: Such a device would cool itself down as it operates, and eventually stop working because it gets too cold! There would need to be the inverse of a heat sink (a cold sink?) attached to it to enable it to operate reliably!

Voldenuit wrote:

yogibbear wrote:Okay. Nothing is 230% efficient. I don't care what the article says. Or what they claim. Or why it "doesn't break" conservation laws. It just CANNOT break it. The energy comes from somewhere. Oh wow it "draws it in" from the surroundings... that's energy consumption. That's not 230% efficiency.

Actually, any engineering student who's taken an undergrad thermodynamics class should be familiar with heat pumps and air-conditioning systems, which are greater than 100% efficient in this sense. Note, however, that the second law is not violated, as the heat output has higher entropy than the original system.

Yup, systems that utilize this effect can actually have a "coefficient of performance" of up to 5.0 (i.e. "500% efficiency", if we use the misleading interpretation in the original article).

cynan wrote:The point is that this 230% efficiency only exists at extremely small potentials. You'd need like a trillion of these LEDs to produce any usable light output - not practical as a light source at this level of input voltage. Unless these LEDs can be made into an array at least as cheaply and on a scale the size of current transistors, I don't see how they'd be that useful as light source or heat engine.

For specific use cases, this could be very useful. Like as a cell phone display light. Not only does it cool down the SoC, it lights the display efficiently.

Not as the technology currently stands, it's not. Unless you could fit an array of trillions of these LEDs behind the cell phone display, you'd simply not get enough light output. And when you ramp up the voltage, you lose the cooling and the above-unity efficiency.