
Re: LEDs waste 75% as heat

So, if we had a new LED, type W, that converted 90% of the input electrical energy to radiation at a non-visible wavelength, it would generate little heat and yet would be rated 'very inefficient'. I see no reason not to like Mike's concept.

Re: LEDs waste 75% as heat

Someone needs to come up with a driver that regeneratively converts heat back into electricity. I hope if someone reads this, then does it, that I AT LEAST get some credit, if not a nice fat check in the mail. Thanking you in advance....

Re: LEDs waste 75% as heat

Originally Posted by onetrickpony

Someone needs to come up with a driver that regeneratively converts heat back into electricity. I hope if someone reads this, then does it, that I AT LEAST get some credit, if not a nice fat check in the mail. Thanking you in advance....

There is an entire body of research devoted to converting low-level waste heat into electricity via improved thermoelectrics. The laws of thermodynamics say that even in the best case you won't be able to convert much waste heat from an LED, because conversion efficiency increases with source temperature. An LED by definition can't run at a high enough temperature to make converting waste heat worthwhile. On the flip side, if someone were to invent relatively efficient thermoelectric converters which could survive being placed in close proximity to a lamp filament, then you could potentially recover a large percentage of waste heat in theory. In practice this wouldn't make much sense. Even with waste heat recovery, overall the lamp would still be less efficient than LED or fluorescent sources. The main application for waste heat recovery is power plants, where even a 1% efficiency increase translates into millions of dollars.

Re: LEDs waste 75% as heat

Originally Posted by jtr1962

There is an entire body of research devoted to converting low-level waste heat into electricity via improved thermoelectrics. The laws of thermodynamics say that even in the best case you won't be able to convert much waste heat from an LED, because conversion efficiency increases with source temperature. An LED by definition can't run at a high enough temperature to make converting waste heat worthwhile. On the flip side, if someone were to invent relatively efficient thermoelectric converters which could survive being placed in close proximity to a lamp filament, then you could potentially recover a large percentage of waste heat in theory. In practice this wouldn't make much sense. Even with waste heat recovery, overall the lamp would still be less efficient than LED or fluorescent sources. The main application for waste heat recovery is power plants, where even a 1% efficiency increase translates into millions of dollars.

I'm pretty sure that the efficiency of the converter is less relevant than the cost of the converter relative to the savings it provides. In other words, if someone comes out with a miniature converter that costs $3 and saves 100 mA on a 10 watt LED, you've got progress.

Re: LEDs waste 75% as heat

Originally Posted by onetrickpony

I'm pretty sure that the efficiency of the converter is less relevant than the cost of the converter relative to the savings it provides. In other words, if someone comes out with a miniature converter that costs $3 and saves 100 mA on a 10 watt LED, you've got progress.

The theoretical maximum efficiency of a heat engine is 1 - Tc/Th, where Tc is the cold side temperature and Th is the hot side temperature. LEDs generally need to keep the die at less than 100°C (373 K) for decent life. Assuming you use some fancy setup where the cold side can be close to room temperature (~290 K), the maximum possible efficiency of a thermoelectric is ~22%. Now, I've been studying the development of thermoelectrics for a long time as a personal interest of mine, mostly for their refrigeration capabilities as opposed to heat recovery (although the same device can do both). Most studies I read point to achieving 50 to 60% of Carnot efficiency as "ambitious and probably unlikely". Assuming we reach this lofty goal (and people have been trying for the last 50 years), you're talking about recovering ~13% of the waste heat at the upper limit of the LED's operating temperature. Since the goal here is more efficient conversion of power to light, you can achieve about the same increase by simply bringing the die temperature close to room temperature via a better thermal path. In fact, power savings is mainly a concern for general lighting, where the LEDs must operate at lower die temperatures for long life. These lower die temperatures are even less conducive to heat recovery. For example, at a 60°C maximum die temperature, which is often a goal for long-life general lighting, the maximum possible Carnot efficiency drops to not much over 10%. A practical device might not recover more than 5%. OK, 5% might still make a difference, but only if the cost of the device is less than the power savings over the life of the LED. If we have a 100 watt LED, then we save 5 watts. Over the LED's 100,000 hour life that's 500 kWh, or about $50 at today's average electric rate of 10 cents per kilowatt-hour.
Can we make an efficient thermoelectric capable of recovering 5 watts at such a low temperature differential, along with the associated heat sinks/pipes/fans to get the cold side as close to ambient as possible, all for $50 or less? I doubt it. Might as well just use that same heatsinking setup on the LED itself. If you can drop the die temperature by that same ~25°C temperature difference over which the thermoelectric is recovering heat, you increase output (i.e. efficiency) by about 6 or 7 percent. In short, it makes more sense to just bring die temps as close to ambient as possible. In fact, regardless of the die temperature, it makes more sense to just cool the LED better than to try and recover waste heat. Don't forget that waste heat recovery systems by definition use huge heat sinks to get the cold side as close to ambient as possible.
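The Carnot arithmetic above is easy to sanity-check in a few lines. This is only a sketch of the post's own assumed figures (373 K die, 290 K cold side, 60% of Carnot as the ambitious goal, $0.10/kWh), not measured data:

```python
# Sanity check of the Carnot-limit figures discussed above.
# All numbers are the post's assumptions, not measurements.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat convertible to work: 1 - Tc/Th."""
    return 1.0 - t_cold_k / t_hot_k

# 100 C (373 K) die, cold side near room temperature (290 K)
eta_max = carnot_efficiency(373.0, 290.0)   # ~0.22, the ~22% quoted

# An "ambitious" thermoelectric reaching 60% of Carnot
eta_practical = 0.6 * eta_max               # ~0.13, the ~13% quoted

# 60 C (333 K) die for long-life general lighting
eta_60c = carnot_efficiency(333.0, 290.0)   # ~0.13 Carnot limit

# Economics: 100 W LED, 5% recovered, 100,000 h life, $0.10/kWh
saved_kwh = 100 * 0.05 * 100_000 / 1000     # 500 kWh over the lamp life
saved_dollars = saved_kwh * 0.10            # about $50

print(eta_max, eta_practical, eta_60c, saved_dollars)
```

Running the numbers confirms the post's rounding: ~22% Carnot limit at a 100°C die, ~13% at 60°C, and about $50 of electricity over the lamp's life.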

On another note, thermoelectrics which operate at a good fraction of Carnot efficiency might make sense for cooling the LED die. As a refrigerator, the Carnot limit on COP is Tc/(Th - Tc). Note how this can be much larger than 1 if Th - Tc is small. For example, if we bring the LED die down from 50°C to 25°C, we'll obtain the aforementioned 6-7 percent increase in output. The maximum Carnot COP doing this would be 11.92, which is ~12. Assume the 100 watt LED is 50% efficient at converting power to light, so the heat load would be 50 watts. Operating at half the Carnot limit we would need 50/6 = 8.33 watts. This is about 8% more power in order to obtain a 6-7% increase in light. It's almost worthwhile. If you can approach Carnot efficiency then you actually end up using less power to cool the LED than the increase in output is worth. Still probably not worthwhile from an economics standpoint, but at least here you stand to actually increase overall LED efficiency by using a thermoelectric (in theory anyway). A while back I did an experiment which vividly demonstrates the effects of cooling LEDs. Even with the much more temperature-sensitive output of an amber LED, I concluded that cooling LEDs with today's thermoelectrics has no practical value.
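The refrigeration side of the same back-of-envelope math (25°C ambient, 50°C die, 100 W LED at 50% efficiency) can be checked the same way. Note this sketch keeps the exact half-Carnot COP rather than rounding it to 6 as the post does, so it lands near 8.4 W rather than 8.33 W:

```python
# Carnot COP check for actively cooling the LED die.
# Temperatures and heat load are the post's illustrative assumptions.

def carnot_cop(t_cold_k: float, t_hot_k: float) -> float:
    """Refrigeration Carnot limit: Tc / (Th - Tc)."""
    return t_cold_k / (t_hot_k - t_cold_k)

cop_max = carnot_cop(298.0, 323.0)   # 25 C cold, 50 C hot -> 11.92
cop_half = cop_max / 2.0             # a device at half the Carnot limit

heat_load_w = 50.0                   # 100 W LED, 50% efficient -> 50 W of heat
cooling_power_w = heat_load_w / cop_half   # ~8.4 W of extra input power

print(cop_max, cooling_power_w)
```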

Re: LEDs waste 75% as heat

I followed your link, thanks for sharing.
I have some questions, please.
Does that sort of efficiency curve hold for the XP-G, and for temperatures from 25°C to 60°C?
And does the lost output all become heat?

Re: LEDs waste 75% as heat

Originally Posted by beerwax

I followed your link, thanks for sharing.
I have some questions, please.
Does that sort of efficiency curve hold for the XP-G, and for temperatures from 25°C to 60°C?
And does the lost output all become heat?

No, actually the output of the amber Luxeon I tested varies MUCH more with temperature than an XP-G would. An XP-G would only increase output by perhaps 8-9% if die temperature were reduced from 60°C to 25°C.

Are we going to see thermoelectrics in motor vehicles?

If we can get Carnot efficiency up, then there is talk of replacing the A/C compressor with thermoelectrics. There is also some talk of increasing mpg by converting waste heat from the engine into electricity. I'm personally dubious of this second possibility because the car would need an electric motor to make full use of this generated power, and in my opinion internal combustion engines will be obsolete for ground transport within a decade anyway due to improved batteries for EVs. Nevertheless, using thermoelectrics for A/C will remain viable regardless of the vehicle's source of motive power, and I think we'll see this.

On another note, I've been waiting since the early 1990s for improved thermoelectrics. What's available commercially now isn't a whole lot better than what was available then. I've read some papers, such as this one, which talk of great improvements over today's devices, but I've yet to see any reach production. Page 2, for example, shows that a single-stage cooler based on their approach could reach 130 K. I would love to get my hands on a thermoelectric with that kind of performance. With what exists nowadays, I'm lucky to approach 200 K (and that's with a two-stage setup, water cooling, and very little heat load). The best I've done with bulk cooling is to get my temperature chamber down to -58°F (-50°C = 223 K).

Re: LEDs waste 75% as heat

If you were to express how efficient LEDs are at producing VISIBLE light in PERCENT, you would have to measure how many watts of radiant energy are coming out within the visible light spectrum. Anything outside of this band is waste, just like the infrared from incandescent lamps.

Re: LEDs waste 75% as heat

Originally Posted by Bright+

If you were to express how efficient LEDs are at producing VISIBLE light in PERCENT, you would have to measure how many watts of radiant energy are coming out within the visible light spectrum. Anything outside of this band is waste, just like the infrared from incandescent lamps.

Exactly, and even within the visible band, the human eye's sensitivity varies greatly with wavelength. E.g. 1 W radiant output of 750 nm red would still be visible but appear several orders of magnitude less bright than 1 W of 555 nm green. The 750 nm would be a dim glow and the 555 nm would appear extremely bright, even if both were produced by a 100% efficient source. Therefore, IMO, it doesn't make sense to quote efficiency as a percentage, which is why we quote things in lm/W.

I guess the OP was distinguishing conducted and radiated energy, so in this case it does make sense.
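To put rough numbers on the eye-sensitivity point above, here is a sketch using approximate photopic V(lambda) sample values (the table entries are ballpark figures for illustration, not an official CIE tabulation):

```python
# Illustration of photopic (eye-sensitivity) weighting of radiant power.
# V_LAMBDA values are approximate samples, not an official CIE table.

LM_PER_W_PEAK = 683.0  # lumens per radiant watt at the 555 nm photopic peak

V_LAMBDA = {
    555: 1.0,        # green, peak of eye sensitivity
    660: 0.061,      # deep red
    750: 0.00012,    # far red, barely visible
}

def luminous_flux(radiant_watts: float, wavelength_nm: int) -> float:
    """Lumens produced by a monochromatic source of given radiant power."""
    return LM_PER_W_PEAK * V_LAMBDA[wavelength_nm] * radiant_watts

print(luminous_flux(1.0, 555))   # 683 lm
print(luminous_flux(1.0, 750))   # ~0.08 lm: roughly 4 orders of magnitude dimmer
```

The same 1 W of radiant power yields wildly different lumen figures, which is exactly why lm/W rather than a bare percentage is the useful metric for visible output.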

Re: LEDs waste 75% as heat

This only means the real efficiency is even lower (he didn't measure so little heat that the invisible part of the spectrum is worth worrying about; on the contrary).
Unless the 294 lm/W phosphor efficacy is overstated, or there's something quite wrong with the measurement (unsinked resistor heat), it seems like that LED has 2/3 of the expected efficiency (a T6 should be about 46% efficient at 1 A, but is at most 31%).

Re: LEDs waste 75% as heat

Originally Posted by jtr1962

If we can get Carnot efficiency up, then there is talk of replacing the A/C compressor with thermoelectrics. There is also some talk of increasing mpg by converting waste heat from the engine into electricity. I'm personally dubious of this second possibility because the car would need an electric motor to make full use of this generated power, and in my opinion internal combustion engines will be obsolete for ground transport within a decade anyway due to improved batteries for EVs. Nevertheless, using thermoelectrics for A/C will remain viable regardless of the vehicle's source of motive power, and I think we'll see this.

On another note, I've been waiting since the early 1990s for improved thermoelectrics. What's available commercially now isn't a whole lot better than what was available then. I've read some papers, such as this one, which talk of great improvements over today's devices, but I've yet to see any reach production. Page 2, for example, shows that a single-stage cooler based on their approach could reach 130 K. I would love to get my hands on a thermoelectric with that kind of performance. With what exists nowadays, I'm lucky to approach 200 K (and that's with a two-stage setup, water cooling, and very little heat load). The best I've done with bulk cooling is to get my temperature chamber down to -58°F (-50°C = 223 K).

Currently essentially ALL thermoelectrics are bismuth telluride junctions. This type is VERY limited, at about 1/10th the cooling efficiency of the phase-change (freon) systems used in air conditioning and refrigeration. And that's about the extent of bismuth telluride technology; it's not expected to get better. It will not replace those systems.

The quantum well superlattice form of thermoelectric device IS being worked on, and DOES seem to have potential for amazing cooling and power generation. If it does work, LEDs are hardly the "big" application. They could potentially generate almost as much energy from a car's exhaust waste heat as the engine itself delivers! At that point one might wonder why bother with the engine at all, instead of just running a naked flame into a thermoelectric generator. And that situation could come up. There are plenty of sources of temperature differentials that could be exploited, IF the devices are cheap enough. And quantum well devices are primarily silicon, so they could be cheap, small, and durable.

Re: LEDs waste 75% as heat

Working out the heat to be dissipated on paper, before installing anything, requires knowing the exact LER (Luminous Efficacy of Radiation) of the LED. The only way to know it accurately is with a spectrometer, although a digital camera and some software might serve as well, with somewhat lower accuracy. Anyway, the heat load for the heatsink doesn't need to be excessively accurate, so an approximation is usually enough. For that purpose, using 300 lm/W (per optical watt, i.e. the energy of the emitted light) is usually good enough.

I have calculated the LER of several white LEDs, and the typical ones (CRI 65-80) are usually around 300 lm/W, while high-CRI ones have lower LERs. Cool tones are often below 300 lm/W, most at 270-290 lm/W, so using 280 lm/W often works fine; warm whites are often over 300 lm/W, up to 330 lm/W, similar to neutrals. Generally, LER increases as CCT (the K rating of light tone) decreases (down to 2700 K or so, where it often goes down again, though there it depends more on the exact spectrum which achieves that CCT than it does at other tones). Very cool whites (7500 K and higher) can have a much lower LER (250 lm/W).

But in general, it is possible to use 300 lm/W for all and get a decent approximation of the heat load. If you want somewhat more accurate results, use 280 lm/W for cool whites and 310 lm/W for warmer tones.

After that, simply calculate or measure the lumen output, divide it by the watts burned (If*Vf), and divide the resulting figure by the LER. That gives you the fraction of input energy emitted as light, the rest being heat.

So for example, take a cool white LED that emits 420 lm and burns 3 W (say an XP-G, simplifying to 3 V at 1 A): it emits at 420/3 = 140 lm/W. Using an LER of 280 lm/W, 140/280 = 0.5. So 50% of the input energy is converted to light, and the remaining 50% (1.5 W) is the heat load.

Or a warm white emitting 300 lm from 3 W of power: 300/3 = 100 lm/W. Applying an LER of 310 lm/W, 100/310 = 0.32, thus 32% of the input energy is converted to light and 68% (2.04 W) is wasted as heat.
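The LER cross-check described above reduces to a one-line formula. Here it is as a small helper, using the post's own example figures (the LER values are the ballpark numbers quoted above, not datasheet data):

```python
# Heat-load estimate from lumen output and LER, as described above.
# LER values (280, 310 lm/W) are the post's ballpark figures.

def heat_load_w(lumens: float, input_w: float, ler_lm_per_w: float) -> float:
    """Watts of heat: input power minus the radiant power implied by the LER."""
    light_fraction = (lumens / input_w) / ler_lm_per_w
    return input_w * (1.0 - light_fraction)

# Cool white: 420 lm at 3 W, LER ~280 lm/W -> 50% light, 1.5 W of heat
print(heat_load_w(420, 3.0, 280))   # 1.5

# Warm white: 300 lm at 3 W, LER ~310 lm/W -> ~32% light, ~2.03 W of heat
print(heat_load_w(300, 3.0, 310))
```

The warm-white case comes out at 2.032 W; the 2.04 W in the example is just the result of rounding 100/310 to 0.32.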

Other threads cover the topic of determining the lumens emitted, either by measuring (integrating sphere) or calculating (from the datasheet, first getting the light output for a given current and then derating it for the LED's actual temperature).

Re: LEDs waste 75% as heat

Originally Posted by Kinnza

So for example, take a cool white LED that emits 420 lm and burns 3 W (say an XP-G, simplifying to 3 V at 1 A): it emits at 420/3 = 140 lm/W. Using an LER of 280 lm/W, 140/280 = 0.5. So 50% of the input energy is converted to light, and the remaining 50% (1.5 W) is the heat load.

The amount of power converted to white light is much less than that.

A white LED is a blue emitting LED that is surrounded by a mixture of phosphors that convert the blue photons into other colors to give a white appearing light.

For a moment, ignore the efficiency of the phosphor and look only at the efficiency of the blue LED. For these blue LEDs, manufacturers actually publish the output of blue light in watts. For example, a 460 nm LED from LEDengin: http://www.ledengin.com/files/produc...LZ1-00DB00.pdf
Driven at 1000 mA it has a forward voltage of 3.6 V, for a power dissipation of 3.6 watts, and produces 900 mW of blue photons, for an efficiency of 25%.

Now consider an inherent inefficiency of the phosphor: it takes a 460 nm photon and converts it into a photon with a longer wavelength. This results in a loss because the emitted photon is less energetic than the absorbed photon. If, for example, we assume that the photons emitted by the phosphor are at 555 nm, then this conversion efficiency would be 460/555 = 83%.

Multiplying these two efficiencies, we get an overall efficiency of 0.25 x 0.83 = 21%.
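The two-stage estimate above can be written out as plain arithmetic, using the quoted figures (900 mW radiant at 1000 mA, Vf of 3.6 V) and the assumed 555 nm phosphor emission:

```python
# Two-stage white-LED efficiency estimate: blue die efficiency times
# the Stokes-shift (photon energy) loss in the phosphor.
# Figures are the ones quoted in the post, not fresh measurements.

def blue_led_efficiency(radiant_mw: float, current_ma: float, vf_v: float) -> float:
    """Radiant watts out divided by electrical watts in."""
    return (radiant_mw / 1000.0) / ((current_ma / 1000.0) * vf_v)

def stokes_efficiency(pump_nm: float, emit_nm: float) -> float:
    """Photon energy scales as 1/wavelength, so the ratio is pump/emit."""
    return pump_nm / emit_nm

eta_blue = blue_led_efficiency(900, 1000, 3.6)   # 0.25
eta_stokes = stokes_efficiency(460, 555)         # ~0.83

print(eta_blue * eta_stokes)                     # ~0.21, the ~21% above
```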

Re: LEDs waste 75% as heat

Originally Posted by David_Campen

The amount of power converted to white light is much less than that.

Given the values stated by Kinnza, his calculations are correct. I do not understand your logic in using a less efficient LEDengin product to compare with the XP-G used by Kinnza. That does not seem fair or logical.

Are you aware that the underlying XP-G royal blue LED has an efficiency higher than 60%, and that a significant percentage of the photons produced by said royal blue LED are emitted directly and not subject to phosphor conversion losses?

Re: LEDs waste 75% as heat

Originally Posted by slebans

Given the values stated by Kinnza, his calculations are correct. I do not understand your logic in using a less efficient LEDengin product to compare with the XP-G used by Kinnza. That does not seem fair or logical.

Are you aware that the underlying XP-G royal blue LED has an efficiency higher than 60%, and that a significant percentage of the photons produced by said royal blue LED are emitted directly and not subject to phosphor conversion losses?

Just my $.02

Stephen Lebans

The post I was responding to was discussing the efficiency of white LEDs, so yes, there is the phosphor conversion inefficiency to be considered.

Yes, I should have used values from a Cree datasheet. I couldn't find a datasheet for a royal blue XP-G, but the datasheet for a Cree XT-E royal blue driven at its rated capacity shows an efficiency of about 40%; multiply that by the phosphor efficiency of 83% and you get about 32% overall: http://www.cree.com/products/pdf/XLampXT-E_ROY.pdf

By driving the Cree XT-E royal blue at 1/3 of its rated power you can get an efficiency of 50%.

Re: LEDs waste 75% as heat

I just used some easy figures in order to illustrate the calculation with an example.

It was not intended as a calculation for a real sample, unless you really do get a true output of 420 lm from 3 W for a cool white. Actually, I don't think any real LED has reached such performance yet, but the latest ones are close.

The main difference between theoretical calculations and empirical ones results from underestimating the decrease in light emission due to heating. That's why I didn't get into calculating lumen output theoretically from the datasheet. Actually, it is much better to measure the emission with an integrating sphere and, while at it, measure the actual power used: the exact forward voltage and the current.

But if you have accurate figures for true emission and power burned, the method of calculating the heat load by cross-checking with LER is very accurate. The finer the input figures, the more accurate the result, of course, but the thermal load doesn't need an extra-accurate figure, so it works fine. If you are interested in knowing the actual energetic efficiency of a given LED for comparison with other lights, then the initial figures need to be as accurate as possible.

Given the large differences that heatsink positioning, amount of air flow, etc. play in determining the temperature of a heatsink for a given heat load, I would say that trying to measure the heat load by comparing heatsink temperatures is not very accurate either. So actually, if you have accurate figures for lumen emission and power burned, this theoretical method probably works better. Cross-checking both methods using the same LED would be very interesting.

Re: LEDs waste 75% as heat

Originally Posted by David_Campen

The post I was responding to was discussing the efficiency of white LEDs, so yes, there is the phosphor conversion inefficiency to be considered.

Yes, I should have used values from a Cree datasheet. I couldn't find a datasheet for a royal blue XP-G, but the datasheet for a Cree XT-E royal blue driven at its rated capacity shows an efficiency of about 40%; multiply that by the phosphor efficiency of 83% and you get about 32% overall: http://www.cree.com/products/pdf/XLampXT-E_ROY.pdf

By driving the Cree XT-E royal blue at 1/3 of its rated power you can get an efficiency of 50%.

It is very difficult to calculate the final efficiency of a white LED even when you accurately know the efficiency of the blue LED inside. It is not as easy as derating the emission by the quantum efficiency of the phosphor, as converted photons have a different effect on luminosity, and thus the LER changes (from 40-75 lm/W for a deep blue LED to ~300 lm/W for a white). The phosphor layer also affects the light extraction efficiency of the package, and different concentrations of phosphor lead to very different light scattering.

Probably with remote phosphor technology all those factors are minimized, but I would say such a calculation is subject to too many parameters to give an accurate figure anyway.

But what is certain is that white LEDs with efficiencies of 50% are waiting in the near future, and the XT-E is close enough to do it with two bins higher or so.

Top brands' top bins of cool whites are all well over 30%, the best ones over 40% when run softly. Due to heating, it is very different to compare LEDs working at 350 mA versus 2.5 A. Junction temps vary a lot, and thus so do actual efficiencies. In order to see efficiencies of 50% with LEDs running at 1 A or over, major advances in reducing current density droop and improving thermal resistance are probably required.

Re: LEDs waste 75% as heat

Originally Posted by Kinnza

. . . . Given the large differences that heatsink positioning, amount of air flow, etc, plays determining the temperature of a heatsink for a given heat load, I would say that trying to measure the heat load by comparing heatsink temperature is not very accurate either. . . . . .

Re: LEDs waste 75% as heat

Zero, of course.

That's what I meant when I said that the LER changes when you apply the phosphor, and that knowing the blue LED's efficiency doesn't allow you to calculate the final outcome of the white LED.

I'm not critiquing your method, Mike. Somebody asked how to determine the efficiency by other means, so I chimed in and explained the method using measured lumen emission. As it allows you to calculate the emitted energy accurately, it gives you the heat load, since energy emitted as light can't end up as heat on the heatsink, and you can't get more energy out than you put in. I think your way is good for cross-checking results. If anything, it could be somewhat improved as others suggested: by isolating the resistors well, gluing them on with thermal adhesive, and measuring heatsink temperature well away from the heating elements.

Re: LEDs waste 75% as heat

Instead of debating so much about whites, why not test the resistor setup with a royal blue? It would be a straightforward way of checking the accuracy of the method without messing with the conversion factors of the whites. Once done, you could be sure your testing with whites is accurate. Anyway, I think it is relatively accurate as is, but my experience says thermal energy is tricky to measure.

Re: LEDs waste 75% as heat

. . . . Not sure why you used what seemed to be such a different resistor compared to the CREE part. Why not use a good size SMT resistor which would have better equated the packaging? . . . .

Compared with the heatsink, the resistor or LED-plus-star has a very small surface area, so it will have little impact on heat removal. In fact, the surface areas of the star and the resistor aren't much different.

At these temperatures, there'll be minimal heat removal from radiation.

Re: LEDs waste 75% as heat

Originally Posted by Kinnza

Instead of debating so much about whites, why not test the resistor setup with a royal blue? It would be a straightforward way of checking the accuracy of the method without messing with the conversion factors of the whites. Once done, you could be sure your testing with whites is accurate. Anyway, I think it is relatively accurate as is, but my experience says thermal energy is tricky to measure.

I don't see how this would improve accuracy. The Substitution Method measures the HEAT that ANY object puts out, by substituting an item that puts out an accurately measurable amount of HEAT.

I did the tests with a couple of white LEDs, so it tells me how much heat white LEDs put out.