
LEDs waste 75% as heat

The ideal LED would convert all the electrical energy into light, but it's clear that a lot gets converted to heat.

Unlike with filament bulbs, most of the heat needs to be conducted out of the back of the LED to keep it cool - the hotter it gets, the less efficient it becomes.

I've seen a few attempts on CPF at working out how much heat an LED puts out, but I think the only reliable method is substitution: find out how much electrical power into a resistor will raise the same heatsink to the same temperature. 100% of the electrical energy that goes into a resistor gets converted to heat - it's very hard to make an inefficient electrical heater!

So over a normal operating range, modern high-power LEDs waste around 75% of the power going into them.
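
To make the arithmetic explicit, here's a minimal sketch of the substitution calculation in Python - the numbers are illustrative placeholders, not my measured values:

Code:
    # Substitution method: find the resistor power that raises the same
    # heatsink to the same temperature the LED did. A resistor converts
    # 100% of its input to heat, so that power equals the LED's heat output.
    p_led_in = 3.1      # electrical power into the LED, watts (illustrative)
    p_resistor = 2.3    # resistor power for same heatsink temp, watts (illustrative)

    heat_fraction = p_resistor / p_led_in     # fraction of input lost as heat
    light_fraction = 1.0 - heat_fraction      # remainder leaves as light
    print(f"heat: {heat_fraction:.0%}, light: {light_fraction:.0%}")
    # -> heat: 74%, light: 26%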

Of course this testing method ignores radiant heat loss from the LED - when I hold my hand 1cm in front of the LED it gets a lot warmer than holding it 1cm in front of a resistor putting out the same heat. EDIT- as pointed out by jtr1962, the warmth I feel would be due to the light being converted to heat when it strikes my skin - the energy has to go somewhere.

Re: LEDs waste 75% as heat

Originally Posted by MikeAusC

Of course this testing method ignores radiant heat loss from the LED - when I hold my hand 1cm in front of the LED it gets a lot warmer than holding it 1cm in front of a resistor putting out the same heat.

Good work but don't confuse the phenomenon you're describing with radiant heat loss. In order to have enough radiant heat loss to feel it with your hand 1 cm away, the LED die would need to be a few thousand degrees. What you're feeling is the light energy from the LED absorbed by your hand, and turned into heat. I first noticed this with a Rebel where if I put black electrical tape very close to the LED dome, it would become hot enough to start smoking. Even my lighter colored finger quickly gets too hot to hold above the dome. It's an interesting phenomenon.

Incidentally, the 10 watt resistors end up slightly increasing the surface area from which heat is dissipated compared to when the LED is mounted, so your experiment might be overestimating the amount of power required for a given heatsink temperature rise. At 1 amp, a T6 XM-L should be outputting about 370 lumens. This might be equivalent to around 1.1 watts of light energy, so the heat would be 1.85 watts instead of 2.04 watts. A slight refinement then might be to put insulation over the resistor body so that there is as little heat dissipation there as practical. Other than that, great experiment and interesting results!
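
For anyone following the arithmetic, the conversion from lumens back to watts of light goes through the luminous efficacy of radiation (LER) of the spectrum. The ~340 lm/W figure and the input power below are my assumptions, inferred from the numbers above, not quoted figures:

Code:
    # Radiant watts = lumens / LER; heat = electrical input - radiant watts.
    lumens = 370.0   # T6 XM-L output at 1 A (from the estimate above)
    ler = 340.0      # lm per radiant watt for a cool-white spectrum (assumed)
    p_in = 2.95      # electrical input at 1 A, watts (illustrative Vf * If)

    p_light = lumens / ler      # ~1.1 W leaves as light
    p_heat = p_in - p_light     # ~1.85 W must be sunk as heat
    print(f"light: {p_light:.2f} W, heat: {p_heat:.2f} W")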

Re: LEDs waste 75% as heat

I had thought of using the dichroic reflector that's used with 50mm halogen bipin lamps - they're designed to reflect the light forward, but let the heat pass through the reflector, to avoid setting fire to the object being lit up. If you look at the bulb filament from the back of the reflector, you can see it only lets a small amount of the light through.

So you could arrange the LED so the radiant heat which passes through the dichroic reflector also heats up the heatsink, while the reflected light radiates into space.

Re: LEDs waste 75% as heat

Originally Posted by jtr1962

. . . .What you're feeling is the light energy from the LED absorbed by your hand, and turned into heat. I first noticed this with a Rebel where if I put black electrical tape very close to the LED dome, it would become hot enough to start smoking. Even my lighter colored finger quickly gets too hot to hold above the dome. . . .

Good point - I'd forgotten that even light will get converted to heat. If you have a light inside an opaque container, the only way the energy can escape is as heat from the outer surface of the container.

Re: LEDs waste 75% as heat

Originally Posted by MikeAusC

I had thought of using the dichroic reflector that's used with 50mm halogen bipin lamps - they're designed to reflect the light forward, but let the heat pass through the reflector, to avoid setting fire to the object being lit up. If you look at the bulb filament from the back of the reflector, you can see it only lets a small amount of the light through.

So you could arrange the LED so the radiant heat which passes through the dichroic reflector also heats up the heatsink, while the reflected light radiates into space.

Interesting idea. I'm reasonably sure, though, that the amount of energy leaving an LED as radiant heat can be measured in microwatts, and is thus entirely negligible for the purposes of this experiment. Remember that radiant heat is proportional to absolute temperature to the fourth power. If the LED die were at incandescent lamp temperatures, it might radiate a couple of watts. However, at perhaps 50°C, it'll only radiate about 1/10000th the power.
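
The scaling behind that estimate is the Stefan-Boltzmann law. Taking roughly 3000 K for a filament (my assumed figure) and 323 K (50°C) for the die:

$$\frac{P_\text{die}}{P_\text{filament}} = \left(\frac{323\ \text{K}}{3000\ \text{K}}\right)^{4} \approx 1.3\times10^{-4}$$

i.e. on the order of 1/10,000, for equal emitting area and emissivity.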

In any case, your assessment that only about 25% of the input power is converted to visible light seems quite correct and reasonable for an LED operated at medium to high currents. You can approach or even exceed 50% at very low currents, but at the expense of using more LEDs. I expect by the end of the decade we'll be pushing efficiencies of 75% to 80%. Such efficiencies have already been reached in the lab.

Re: LEDs waste 75% as heat

This is hard to measure, and your methods are somewhat shot-in-the-dark for accuracy.

Just look it up. Lumens are a measure of total light output, not a light intensity that increases with focusing. Unfortunately, rather than having a fixed relationship to power, the lumen scale is weighted for the human eye's response: 5 mW of green laser appears about 10x brighter than 5 mW of red laser, and a green LED putting out 100 mW of green light will score about 10x the lumens of a red LED putting out 100 mW of red light.

Given a particular wavelength or white color temperature, you can look up how many watts of light energy make up 1 lumen. So look up the spec sheet's "typical" figures for current, voltage, and lumens and you can calculate the efficiency straight out.
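
As a sketch of that calculation (all the datasheet figures below are placeholders, not values from any particular spec sheet):

Code:
    # Efficiency from datasheet "typical" figures:
    # radiant watts = lumens / LER; efficiency = radiant / electrical watts.
    lumens = 280.0   # typical luminous flux (placeholder)
    v_f = 3.0        # typical forward voltage, volts (placeholder)
    i_f = 0.7        # test current, amps (placeholder)
    ler = 320.0      # lm per radiant watt for this spectrum (placeholder)

    p_electrical = v_f * i_f
    p_radiant = lumens / ler
    print(f"efficiency: {p_radiant / p_electrical:.0%}")   # -> 42%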

Re: LEDs waste 75% as heat

I tried to figure a margin of error for MikeAusC's work, but figured it didn't matter or affect the conclusion.
I think he clearly demonstrated that while LEDs seem quite efficient (25 percent energy-to-light is a long way from a candle or a heated wire), there's still plenty of room for improvement. So we are probably not at the peak just yet.

Re: LEDs waste 75% as heat

What you see as light ... what you feel as radiated heat - it's all the same thing. Just different frequencies.

Yeah, but I feel that is oversimplifying things. That's like saying that a candle and a plasma torch are the same thing.

Light and heat may fall under the electromagnetic umbrella, but they are definitely different as far as how humans perceive them. That is, you can see heat and you can feel light, but you're much more likely to perceive them the other way around.

Light IS a form of energy. Various sources of both will emit the other, but let's be clear, heat is NOT light energy.

Re: LEDs waste 75% as heat

Originally Posted by onetrickpony

That's like saying that a candle and a plasma torch are the same thing.

Getting OT here... but that's not what I meant, although I think we are agreeing with each other. A plasma torch will radiate infrared heat which you can feel, but it also has a cutting flame of hot plasma, which yes is obviously a very different kettle of fish. The plasma does the cutting, but it also loses some energy as radiant (IR) heat which you can feel. Applying the torch to your hand results in a different method of energy transfer (and would be a bit painful).

Originally Posted by onetrickpony

Light IS a form of energy. Various sources of both will emit the other, but let's be clear, heat is NOT light energy.

I agree, but can I reword it as "Electromagnetic radiation within certain ranges of frequencies can be detected by the human body as either 'light' or 'heat' with varying sensitivities"?

Re: LEDs waste 75% as heat

Lotta confusion going on in here with numbers and what is and isn't heat.

Theoretical maximum for white light is somewhere around 300 lm/W. So a Cree XM-L driven at 0.35 A will have 50% efficiency. Of course it goes down with increasing current.

Incorrect statement: if it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of the theoretical maximum efficacy, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the eye's response compared to 1 W of light energy. In daytime (photopic) vision, 555 nm is 683 lumens per watt, and it's lower for all other wavelengths, tapering off towards infrared and UV.
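
A small sketch of that weighting, using rounded values from the CIE 1931 photopic curve (the table here is abbreviated, not a full lookup):

Code:
    # Luminous flux = 683 lm/W * V(lambda) * radiant watts, where V(lambda)
    # is the photopic luminosity function (peaks at 1.0 at 555 nm).
    V = {450: 0.038, 510: 0.503, 555: 1.000, 610: 0.503, 650: 0.107}

    def lumens(radiant_watts, wavelength_nm):
        return 683.0 * V[wavelength_nm] * radiant_watts

    print(lumens(0.1, 555))   # 0.1 W of 555 nm green -> ~68 lm
    print(lumens(0.1, 650))   # 0.1 W of 650 nm red   -> ~7 lm (about 10x less)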

I agree, but can I reword it as "Electromagnetic radiation within certain ranges of frequencies can be detected by the human body as either 'light' or 'heat' with varying sensitivities"?

Close, but still confusing, possibly misleading. The only sense organ in the body that directly detects electromagnetic frequencies is the eye; 'heat' as felt by the skin is an indirect sense from the heating of the skin itself. I.e. a thermometer doesn't sense electromagnetic frequencies, but if you shine a 1 W laser on it, it's still gonna go up.
Various frequencies of the electromagnetic spectrum are absorbed by the body at differing rates, so heating varies. Everything from UV down to a few GHz will be absorbed at least partially by the skin, causing local heating, felt as warmth. UV and above is ionizing and bad, and starts to just zing right through you, breaking DNA along the way and causing cancer, but not inducing much heating. Stuff below a few GHz penetrates the body to varying depths, causing internal heating, but that's not felt as there aren't heat sense organs inside your body, so by the time you do feel it, damage via heating can be done. And lower frequencies have wavelengths larger than you and pass through without a care in the world.

Yes, longwave IR cameras are traditionally called "heat vision", but they just see the IR emissions from the warm object. Kind of like how incan lamps emit light, you emit IR light too, just in the 310 K spectrum. IR falls into a sort of niche where we can't see it, we can feel the heating effects due to it being absorbed by the skin very quickly, and it's too high a frequency for us to do direct emission via an antenna (getting there! THz transmitters aren't the stuff of fiction anymore).
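
The "310 K spectrum" remark can be made concrete with Wien's displacement law:

$$\lambda_\text{peak} = \frac{2898\ \mu\text{m}\cdot\text{K}}{310\ \text{K}} \approx 9.3\ \mu\text{m}$$

which is squarely in the longwave IR band those cameras image.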

I would post a certain picture of Jackie Chan right now, but I know the moderators on here wouldn't be too fond of it.

Re: LEDs waste 75% as heat

Originally Posted by CKOD

Lotta confusion going on in here with numbers and what is and isn't heat.

Incorrect statement: if it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of the theoretical maximum efficacy, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the eye's response compared to 1 W of light energy. In daytime (photopic) vision, 555 nm is 683 lumens per watt, and it's lower for all other wavelengths, tapering off towards infrared and UV.

Not sure where you are coming from, but if the Luminous Efficacy of Radiation (LER) value for the XM-L at 350 mA is 300 lumens per watt and the LED emits 150 lumens per electrical watt, then the electrical efficiency of the LED is 50% (watts out vs watts in).
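
Written out, the relation being used here is:

$$\eta = \frac{\text{source efficacy}\ [\text{lm per electrical W}]}{\text{LER}\ [\text{lm per radiant W}]} = \frac{150}{300} = 50\%$$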

Re: LEDs waste 75% as heat

Originally Posted by MikeAusC

The point of this experiment was to answer the frequently-asked question "I'm feeding x watts to my LED, how much heat will I have to remove?"

Excellent experiment, MikeAusC. It's nice to have a rough guideline for this sort of thing. Do you have any idea how similar the results would be with other LEDs? Would results correspond to the relative efficiencies of the LED tested vs the XM-L T6?

Re: LEDs waste 75% as heat

Originally Posted by slebans

Not sure where you are coming from, but if the Luminous Efficacy of Radiation (LER) value for the XM-L at 350 mA is 300 lumens per watt and the LED emits 150 lumens per electrical watt, then the electrical efficiency of the LED is 50% (watts out vs watts in).

Stephen Lebans

Ahh, OK - I thought you were referencing a theoretical ideal LED vs the actual LED, not the lm/W value for its emission spectrum vs its actual output. That makes more sense and is correct then, though there would be some error introduced by the color temperature, but that's not as significant.

Re: LEDs waste 75% as heat

Originally Posted by CKOD

Incorrect statement: if it achieves 150 lm/W and the theoretical max is 300 lm/W, then it achieves 50% of the theoretical maximum efficacy, not an efficiency of 50% (watts out / watts in). I wish I could find a good table of wavelength vs lumens per watt. Lumens are defined by the eye's response compared to 1 W of light energy. In daytime (photopic) vision, 555 nm is 683 lumens per watt, and it's lower for all other wavelengths, tapering off towards infrared and UV.