High-Power LEDs: Hotter Than You Think

While running simple experiments with a high-power LED array illuminating a phosphor-coated sheet to create white light with specific properties, Ed Rodriguez stumbled upon something big. His experiment required a CRI above 90 and a high CCT (5,600K) for TV and motion-picture studio lighting.

He was operating the array at 25W with a fan-cooled heat sink and LED substrate, both less than 35°C -- the junction was certainly less than 45°C. Yet, when he placed his hand near and then on top of the LED, he was nearly burned. What?

Naturally, Rodriguez repeated the process with a variety of LED arrays rated from 10W to 100W, in several form factors, running many individual tests. The results (which he documented on EDN, a sister site) were consistent. They led to multiple anomalies on the way to the smoking gun: High-power LEDs do generate IR. The heating he found in each case was substantial: Plastic melted, water boiled in seconds, and measured temperatures soared during the experiments.

Rodriguez wrote in his EDN post that all materials exhibit varying degrees of radiant-energy-absorption properties that are difficult to quantify.

When the radiant energy is absorbed, it can be manifested as re-radiated IR. Is that heat really there before I put my hand over it? My hand absorbs some of that re-radiated IR and I perceive heat. Same for the evaporated water drop. But that is "real" heat. The revelation here is that this heat has absolutely nothing to do with the PN-junction-generated heat... heat that is typically transferred to a heat sink via conduction.

He asserts that LED companies don't have the equipment to perform adequate testing, nor do they perform the types of tests that led to this discovery. Before you ask, it's not the Stokes effect -- the heat generated when a phosphor absorbs blue light in the process of converting it to white light. This is very real, and it would seem dangerous, depending on the application.

LED bulbs are relatively new to the market, yet there have already been recalls based on heat and fire problems. Rodriguez cited Lighting Science's recall of 554,000 LED bulbs in March because of a fire hazard.

He told us:

LEDs have puttered along for over 35 years as a little component that provides tiny amounts of light as an indicator or light source in an optocoupler. As slightly larger chips were made and covered with phosphor to make white LEDs, there was no reason to change the thinking as millions of little white lights were sold. It's just in the past few years that chip-on-board arrays, rated up to 100W, were introduced. What LED manufacturers did not appreciate was that the physics of light and heat have some relationships that do funny things as power increases. Measuring these things is quite complicated. Companies like Fluke, with decades of experience measuring radiated thermal energy, understand the issues, but in LED companies, the lack of experience, awareness, and understanding is rather astounding.

Rodriguez feels that this needs to be addressed soon and documented. Acrylic lenses may discolor, temperature measurements will be higher than advertised, and certain materials put in proximity to 100W COB LED surfaces may degrade for no apparent reason.

There is no need to change product approaches, but there is a need to be aware of light-generated radiant heat as a performance factor -- something that most people did not even know existed.

I visited Luminus Devices in 2012 and still have two of their LED modules, blue and red. They do get rather hot, even when powered with just a 9V battery. They need to be water cooled.

"Test stations also need fixtures, and Luminus engineers must design fixtures that cool the LEDs' 150 W of heat. 'That's as much heat as in a professional-grade soldering iron,' said Joffe. Furthermore, the LEDs must be tested under consistent current and temperature conditions. The test fixtures are water cooled, and the engineers have designed mechanical fixtures that provide consistent contact with heat sinks."

I thank all of you for your comments. What is clear to me is that there are technical folks out there with far more experience and knowledge than I have relative to the physics of high-power visible and non-visible radiation sources and the heating they can surprisingly create. As one of you essentially stated: "So what's the big deal? Many of us have known these things for years in more advanced radiated light sources..."

But I work in the world of practical commercial LED lighting technologies -- with the interdisciplines of LED chip mechanisms, phosphors, LED drivers, heat sinking, optics, etc. -- and I can say with certainty, as Keith Dawson alluded to, that 99% of "working stiffs" in the LED world accept as fact that 100% of the meaningful heat associated with LEDs comes from the PN junction (I've proven that Stokes-effect heating is trivial) and that the appropriate transfer of that heat is the one and only "thermal issue" in how power LED arrays, lamps, and fixtures are designed. As the former founder/CEO of both a power semiconductor company and a switching power supply company, I am acutely aware of the thermal relationships.

My major (and perhaps only) point here is that 99% of all the folks working in the mainstream LED lighting world are totally unaware of (or unwilling to talk about) the fact of this emission-side heat, which can be as much as 8% of total power applied. It behooves any maker of luminaires using arrays rated over 25 watts to be aware of any effects on his product's performance or reliability. I came upon this issue because this supposedly nonexistent heat melted the remote phosphor sheet I had placed over (about 0.060 in. away) a blue LED array whose substrate temperature was only 37°C.

I think all of you might agree this was not "imaginary." That's what started me on this little journey.
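For a sense of scale, a back-of-envelope estimate shows how a sheet a fraction of an inch from a small high-power array can absorb damaging flux even while the substrate stays near body temperature. Every number below is an illustrative assumption, not a measurement from Rodriguez's setup:

```python
# Back-of-envelope irradiance near a small COB array (all numbers assumed).
INPUT_POWER_W = 25.0      # electrical input to the array
RADIATED_FRACTION = 0.5   # assume roughly half the input leaves the face as radiation
EMITTING_AREA_CM2 = 1.0   # assumed emitting-surface area

# Very close to the emitting face, irradiance is roughly the surface exitance.
irradiance_w_cm2 = INPUT_POWER_W * RADIATED_FRACTION / EMITTING_AREA_CM2
SUNLIGHT_W_CM2 = 0.1      # full midday sunlight, for comparison

print(f"Irradiance at the face: {irradiance_w_cm2:.1f} W/cm^2")
print(f"Relative to sunlight:   {irradiance_w_cm2 / SUNLIGHT_W_CM2:.0f}x")
```

Under these assumptions the sheet sits in a flux on the order of a hundred times full sunlight, which makes melted plastic far less mysterious than a 37°C substrate reading would suggest.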

What is astounding is the lack of information on this from the leading makers of high power LED arrays, white or color only.

LEDs all produce heat at the PN junction, and those who use them in lighting products understand this well and dissipate the heat through a variety of strategies. The heat Ed discovered has nothing to do with the PN junction. His paper claims that the forward-radiated heat amounts to 8% to 9% of the input power of the LED.

There's something silly about this "new discovery." If the LED draws 25W of power, it will create photons with that much power, minus whatever inefficiencies there are in the electronics of the power supply. So, no matter how you twist the words, the heat this 25W LED will generate in total will be no more than the heat created in total by a 25W incandescent bulb, although the incandescent bulb generates far less visible light and far more IR radiation. (A 25W LED should generate the equivalent visible light of a 125W incandescent, give or take, with today's LEDs.)
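The commenter's energy-balance argument can be made concrete with rough numbers. This is only a sketch: the 40% wall-plug efficiency is an assumed round figure, and the 8% forward-radiant fraction is the number quoted from Rodriguez's paper, not an independent measurement:

```python
# Rough energy budget for a 25 W LED array (fractions are illustrative).
INPUT_POWER_W = 25.0
WALL_PLUG_EFFICIENCY = 0.40   # assumed fraction of input emitted as light
RADIANT_HEAT_FRACTION = 0.08  # forward "emission-side" heat figure from the article

optical_output_w = INPUT_POWER_W * WALL_PLUG_EFFICIENCY
junction_heat_w = INPUT_POWER_W - optical_output_w  # conducted to the heat sink
forward_radiant_w = INPUT_POWER_W * RADIANT_HEAT_FRACTION

print(f"Optical output:        {optical_output_w:.1f} W")
print(f"Conducted at junction: {junction_heat_w:.1f} W")
print(f"Forward radiant heat:  {forward_radiant_w:.1f} W")
# However the heat is partitioned, the total can never exceed the 25 W input.
```

The point of the arithmetic is the commenter's own: the forward-radiant 2 W is not extra energy; it is part of the same 25 W budget, just delivered by radiation instead of conduction.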

Last I checked, E = hν, where E is energy, h is Planck's constant, and ν is the frequency of the photon. This means that all electromagnetic radiation, including visible light, carries energy, and that an IR photon actually carries less energy than a visible photon because it has a lower frequency.
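The relation can be checked with a quick per-photon calculation (the wavelengths chosen here are representative values, not from the article):

```python
# Photon energy E = h*c/lambda: visible photons carry more energy
# per photon than IR photons.
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of one photon at the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

for name, nm in [("blue", 450), ("green", 550), ("near-IR", 950)]:
    print(f"{name} ({nm} nm): {photon_energy_ev(nm):.2f} eV per photon")
```

A blue photon at 450 nm carries roughly twice the energy of a near-IR photon at 950 nm, which is the commenter's point: absorbed visible light heats a target at least as effectively as absorbed IR.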

Want proof? Buy an IR absorbing filter and put it in front of a Fresnel lens; take the combination outside on a sunny day and catch a leaf on fire.

I've worked around Metal Halide arc lamps for projectors and we always had to be careful of getting anything in the way of the beam when it was near focus, even with an IR filter.

Or how about lasers? By definition they emit only one wavelength, but you can buy visible lasers with which you can solder. (Please don't look at the spot!)

This problem is nothing new to optical engineers who have worked in illumination. Highly concentrated visible light can be dangerous.

I have been working with COB LEDs rated up to 1,200 watts, and Ed's observations are absolutely correct. And don't forget that 1/4 to 1/3 of all heat comes from the down-conversion process of the phosphors floating in silicone. A bare blue-die COB placed in close proximity to a plastic reflector will melt it; you have to use a metal reflector of some sort to withstand the absorbed IR. FLIR images give you an idea of what you are dealing with, but there are many sources of heat in the system and they are all co-located.
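The down-conversion fraction this commenter cites is in the same ballpark as a simple Stokes-shift estimate. A sketch, with illustrative wavelengths (a real phosphor has a broad emission spectrum and additional conversion losses):

```python
# Stokes-shift loss: energy given up as heat when a phosphor converts
# a blue pump photon to a longer-wavelength photon (wavelengths illustrative).
PUMP_NM = 450   # blue pump wavelength
EMIT_NM = 560   # assumed effective phosphor emission wavelength

stokes_loss = 1 - PUMP_NM / EMIT_NM  # fraction of photon energy lost as heat
print(f"Stokes loss per converted photon: {stokes_loss:.1%}")
```

Under these assumptions roughly a fifth of the converted power becomes heat in the phosphor layer itself; with quantum-efficiency and scattering losses on top, the 1/4-to-1/3 range cited above is plausible.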

The item does not mention any work done to show the energy came from IR rather than visible. If you put your hand in front of 100W of white light, it will get hot. What was done to show that the burning sensation was caused by IR rather than by visible energy?

My recollection is that the LED chip controllers / circuits also develop considerable heat which explains the need for cooling fins. I recall that being an issue when we tried to develop some LED lighting systems in the lab for machine vision systems. Am I missing something?