Meta

LED logger

Why?

About half a year ago I installed a 3 meter warm white LED strip above my desk. Although I didn’t spend a lot of money on it, I am really happy with the light quality and intensity. Lately I have been thinking about adding more LED strips and checking the options. One important aspect is the lifetime of these products: are they better than bulbs overall? Obviously there’s no specification you can trust anywhere. So how do I know if cheap strips are good enough, or whether it is worth spending the money on more expensive ones? Measure some data for the cheap ones and see. The experiment was designed to be simple, not extremely accurate.

Update:

The measuring method has been criticized, and some of the criticism is fair: there’s no control and the sensor may degrade. I stopped the experiment after 1400 hours to upgrade it. But first I replaced the measured strip with an identical, unused one, and the sensor reads exactly the same as in the beginning. So there is NO sensor degradation. Later, when the measurements reach the 70% threshold, I will replace the strip with another identical unused one and compare the sensor readout. This should eliminate the sensor variable and provide a control. See the new experiment, LED logger v2.

The Logger

I set up a very simple device for a first measurement attempt: a metal project box houses 15 cm of unused strip that shines light on a TSL2550 sensor, everything isolated from ambient light. The strip is placed on the aluminum front panel of the box, which provides far more cooling for the strip than real-life applications where it might be mounted on wood or walls. The box is sealed so neither ambient light nor dust can get in.

This sensor is really great: it has an I2C interface and a high dynamic range, and it approximates the human eye response, giving the result directly in lux (after some math). I used a microcontroller and an EEPROM with a couple of years of space, storing data every 6 hours. A serial port provides a simple interface with instant read, memory erase and memory dump commands. The current is measured over a 2 ohm resistor and stored as ADC counts. The board is Arduino compatible, but I did not use the Arduino environment, just regular C.

Results:

The light has been on continuously for about 1400 hours, roughly two months (700 hours, almost a month, when this was first written). The graph shown below will be updated periodically. So far the LED strip has dropped about 18% in brightness compared to the beginning (12% at the 700 hour mark). The current has stayed the same, meaning the LEDs are actually losing efficiency.

Although it is too soon to tell exactly, I decided to try to predict the life of the LEDs. Since they don’t burn out but rather fade, I think the point where they drop to 70% of initial intensity is a good mark of their end of life in applications where they are used for lighting rather than decoration.

Using an exponential projection I found that it will take just 2200 hours for the LEDs to drop to 70% of their initial intensity. This is rather disappointing, even for cheap LEDs. In about 2-3 more months I should know for sure how good this initial prediction is, but I doubt it will be far from the truth. UPDATE: after about 1400 hours of use the prediction still stands; the strip should fall to 70% at around 2200 hours.

How bright is an LED strip?

A while ago I ran an experiment to find out how much light a 3 m LED strip gives. Lacking any specialized equipment, I compared it against some references. I cut the strip into 50 cm pieces, grouped them on the ceiling of a room, and used my camera to determine the correct exposure for the whole room. Next I replaced the LED strip with incandescent bulbs until I got the same exposure: it took a 25 W and a 40 W bulb. According to their packages they give 200 and 400 lm respectively, totaling 600 lumens.

Using my modified power meter I found that the strip consumes 14.3 W from the mains. This puts the LEDs at about 42 lumen/watt, which is comparable to the worst CFLs and still rather poor for LEDs. But do remember that this takes the PSU into account, which is the right way to evaluate for practical purposes.

LED strip vs incandescent: Preliminary conclusion

Based on the price I paid for the LED strip (~4.5 EUR/m), a local electricity price of about 0.09 EUR/kWh and the bulb price, but excluding the LED PSU and the labor of installing everything, it takes about 2800 hours for the LEDs to become cheaper than the incandescents.

Since my strip is no longer useful after 2200 hours, when it becomes too dim, it is not a cost-effective solution for illumination compared to incandescent bulbs. LEDs are supposed to save energy and money; it appears that low-cost strips are not a good way to do that.

What about more expensive LED strips? Using the same reasoning as above, I found that more expensive strips with Nichia LEDs need about 7000 hours before they become cheaper than incandescent, again excluding the PSU. This is a rough estimate that assumes the brightness and efficiency specs are accurate. Again, the manufacturer of the strips doesn’t give any data about intensity decay, so the cost-effectiveness of that solution is still unknown.

Download

The code for the logger is available for download here. It’s free for non-commercial use and rather unpolished.

Old post, I know, but I just saw it. No surprise here. White LEDs usually use a phosphor or a mix of phosphors to make white light from blue. The LED degradation comes from both heat (the heat is produced in a very small volume, so the die temperature may be 10 or 20 °C higher than the case temperature, which is a factor of 2 to 4 or more in diffusion-related degradation over what would be seen at case temperature) and degradation of the phosphors, which degrade just like the phosphors in an old CRT with burnt-in images. Phosphor degradation causes color change as well as output change, while degradation of the die produces an efficiency drop with little change in color.

Unless you need to provide a dimming feature on the LED strips, I would suggest that you try to use a high frequency switching power supply with a low duty cycle to feed the LED. Apparently, some long life traffic light modules use a switching power supply to extend the LED life to more than 5 years before the LED dims to 70%. I also believe that the power supply circuit in these modules slightly overdrives the current into the LEDs. Maybe try overdriving the LED current (to 120%) while limiting the duty cycle to 20% ON. No guarantees – just offering a suggestion!

Kudos for the idea of testing LED sustainability. What is your formula for calculating the break-even point of replacing incandescent with LED? Your local electricity price is very low and I bet it is an important factor in this calculation. If you give the formula, the reader can ‘correct’ for local electricity price.

To calculate the cost of incandescent I use:
time * lamp_cost/1000 + lamp_power/1000 * energy_price * time
For LEDs:
led_cost + led_power/1000 * energy_price * time
You may also plot them directly and see where they intersect, or just solve for the time where the two are equal.
time is in hours
lamp and LED power is in watts
energy price is per kWh

I believe the data may be useful, but I highly doubt sign makers will use cheap LEDs; changing them often is not practical, and signs also operate long hours. So there probably isn’t that much useful data there.

Not to mention that I have no model number or datasheet for my strips.

Very interesting. I have a 5 meter cool white 5050 LED strip in this room that I’ve been using for many hours a day for a few years (3+, I forget how many). Its light output is now terrible, and many of the LEDs have died, probably because I didn’t stick it to anything but just left it hanging in the air with the LEDs pointing towards the ceiling for a diffused light setup; it gets quite warm.

I recently bought a 5 meter cool white 5050 LED strip and a 5 meter warm white 5050 strip; the brightness of just 1 meter completely outshines the old worn-out 5 m strip. I bought both types because the combined light colour is very nice, and I will be able to control the brightness of each through a custom remote-controlled PWM setup. With double the number of LEDs I can run them at a lower brightness for a longer life (plus they’ll be stuck to some aluminium angle in 1 meter lengths for heat dissipation).

BTW, if you try this test again with a new batch of LEDs how about putting 2 strips in the sealed box, one as the test subject and the other as the control.

The test strip stays on all the time, and the control strip gets turned on once a day for a short burst to measure its brightness against the test strip (which gets turned off briefly while the control strip is on).

That way if the light sensor does degrade over time you can compensate for this by using the data from the control LED strip.

What about dust accumulating on LEDs and/or sensor as an explanation for intensity decay? LEDs have small DC potential over them, which will attract small amounts of dust. Same for sensor.

There could be an explanation for the theory of “cheap LED fading”: some cheap LEDs are known to have the yellow phosphor in the form of an emulsion in a transparent polymer (search Wikipedia for “white LED” for an explanation of what the yellow phosphor is doing there). So it is possible that this transparent polymer becomes progressively more opaque under heat and radiation. But I still doubt that this is happening _that_ fast. Let’s wait for at least the half-year result graph.

It’s not dust. Some sources claim that the UV from the chip changes the epoxy and makes it absorb light. Whatever the cause, it happens. My lifetime tests agree with the author’s: the white LEDs from most Asian (no-name) makers go dim after a thousand or 1500 hours. Philips and Nichia are the exceptions, and will last ten thousand or more hours. You buy inexpensive white LEDs and consequently you get a short lifetime. They should require LED makers to supply them in packages that fit into a lamp socket, so you can easily change them after they go dim.

Cool post, but I didn’t see anywhere if you are derating the LED strips. If you are running them at the maximum ratings, they will not last long. Most semiconductors should be derated to ~70% of their maximum power for a much longer life. Also how stable is your power supply? Line and load regulation? If you are running at maximum power and your supply is going above that, you will damage the LEDs quickly.

I don’t know how to interpret this: if I were using bare LEDs, then yes, I should run them somewhere below 100%. But I am using a LED strip which incorporates the LEDs and which is designed to work at 12 V from standard 12 V supplies. The manufacturer is the one who should have made sure the LEDs run below their maximum.

I checked the specs for the light sensor. It’s a weird little thing. The output is packed into eight bits where the MSB four represent “octades”, or “chords” and the last four are linear steps within the octade. I am assuming (because I couldn’t find it in the spec) that using their formula produces a number that represents lux. If so then you are measuring about 700 lux and that means the step size in that octade is 32 lux. Your graph shows a step size of over fifty in the downward direction and perhaps twenty lux upward. Something’s not right there.

This sensor is a coarse resolution measuring device – it has to be so it can cover the broad range of light intensities for which it is designed. I think that explains the step changes in the graph. However it does not explain the size of the step changes or the small upward change between 370 and 450 hours.

The compander scheme they use is highly inaccurate because the actual step size within an octade should be derived from a lookup table that mimics the step sizes one would see on an octal log scale. The first step should be almost half the full value and the step from 6 to 7 would only be ten percent. Instead they just hack a linear step size into a device because it’s not meant for any kind of precision measurement. These are designed to provide the input needed to raise and lower LED drive currents on LCD displays and for that the scheme is fine.

One other thing to consider is that the eye’s response is logarithmic, so a drop of fifty percent in light output energy is only a 3 dB difference, and that’s three “Just Noticeable Differences” as far as the eye is concerned. I’m assuming the response of the eye is similar to the response of the ear to energy changes. I would probably not even notice a twenty percent change in light output, let alone be irritated enough to replace the strip.

For that reason I would double the useful hours for the LED strip. Furthermore, it is easy to tack up a second strip when the first dims and leave the first one in place. This would make my work area brighter, and after another four thousand hours I would still have the brightness of my original single strip when it was new. Incandescent lamps blow out and have to be replaced, so that has to be factored into the cost.

I’m also not sure that the decline in brightness is linear over time. It may decline for a while and then hold steady. Linearity is the bugaboo of human thought; nature doesn’t use it all that much – she prefers exponential and log processes. It’s quite possible that the dimming of the LED strip halves over a given period. That would mean it reaches half power in 8000 hours vs. the 4000 hours implied by a linear decline.

Would you be willing to publish the set of raw eight-bit values from the light sensor as a .csv file? I’d love to have a look at them. I am using a PSoC 5 evaluation board to measure the current output from my PV panels, and I’d like to sense the level of the sunlight with a second device – this little guy might do the job. Having your numbers to work with would help. So far I’m kind of iffy on it. It seems like a rather half-baked spec sheet – there is no mention of change in output vs. temperature. I suppose I can shine a light on it while I toast it with my SMD hot-air wand.

You have raised some good points. I suspect there is a problem in how the micro calculates the output in ‘lux’. I’ll dump the raw data as soon as I can and publish it.
I believe the sensor is not good enough for sunlight; it might saturate. There’s no working range given in the datasheet.
I considered the sensor good enough for detecting the 70% threshold. The 70% threshold is part of an “official” rule for rating LED life, the L70 lumen-maintenance level. There are others, as the usage of the LEDs dictates how dim they can get before being considered useless.

The 70% figure is, no doubt, published by those who stand to profit from replacement. Try lowering the drive current from 100 counts to 75 counts (that will halve the power) and see how much difference it makes in your setup.

A neutral density filter would solve any saturation problems I think. But the specs for output response show a deviation that varies from 50% to 150% – this thing isn’t very accurate. Another factor is that I believe the solar panels are most sensitive to IR and this chip is modified to approximate the response of the eye. I’ll probably use something else. The idea of having a chip that just runs off of four lines with a digital output was pretty attractive.

Glad you took the time to make these measurements – that’s the kind of thinking we need nowadays. Take nothing for granted.

I understand there are some studies behind that 70% figure, but I’m sure it is application dependent. The sensor is indeed not a precise one, but my thought was that I don’t need one, since I am not interested in absolute precision but in determining how long it takes to drop to some level.
I think that for your application you could use something different. It’s hard to find a sensor with good absolute precision; maybe you could use a cheap sensor but calibrate it against some reference. You would also need a different type, as solar panels have a different sensitivity than the eye.
Also see: http://cool.conservation-us.org/byorg/us-doe/luminaire_reliability.pdf

How about the explanation that the sensor is degrading? I seriously doubt your LEDs are degrading, since that would presumably mean your LED strip came straight out of factory (the only mechanism by which PN junctions can change is by impurity diffusion, which does not depend on the current through the junction). I do not believe your results.

Further, what did you use to plot that graph? Even though I think seeing steps is kinda weird (I would expect a continuous dropoff), you seem to have a Gibbs phenomenon, which would be much, much more interesting than the degrading of your LEDs.

Ergo, not only do I not believe your LEDs are degrading, I think you made an incorrect plot of your data.

PS, I just saw your comment on Hackaday in which you ask “Why would it ? There’s nothing that should degrade over time as there is in LEDs..”

If you believe that, I suggest you learn what the sensor is made of. It’s exactly the LED, but used in reverse. Read the section “Application Information” (as you should) in the datasheet – it says that the photodiodes are silicon-based. Not only that, what you are reading out is the value of one photodiode minus another – you have 2 (!) sources of error. Replace the sensor before you make any more claims about degrading of the LEDs.

Okay, I’ve thought about this, and what you could do instead is use, for example, a webcam to measure the light intensity. Admittedly its resolution may not be as good as this one, but at least it has a more predictable flux-output relation. It’s also a low-budget device and relatively easy to use.

OK, the webcam is a valid point, but I would need one that supports manual control, otherwise it will use auto brightness. That complicates things, since I would need to check it periodically and take the measurement myself.

Why I think the LEDs are the ones degrading and not the sensor: the LEDs are under more stress than the sensor. Their power density is larger, plus there’s also the phosphor that degrades. The stress on the sensor, on the other hand, seems a lot lower.

Many (I want to say “most”, but I simply don’t know) webcams have a (soft) switch for brightness control.

To say either component is under some kind of “stress” completely misses the point of electronics. A CPU under 100% load does not die more quickly than one under 10% – this is not how semiconductors work. The strength of electronics is that they are very load-invariant, as opposed to mechanical systems. And even then you’re not quite right: iirc a cannon gets harder with every shot.

Degrading of the phosphors is a possibility, but they are rather stable chemical compounds so this would be on an entirely different order of magnitude – again, if it were the cause it would be a breakthrough.

Either way, I think it’s safe to say you made many premature claims, which you’ll have to back up if you want to be taken seriously.

You are right about that; I will have to double-check the circuit. I understand having some measurement noise, but the light should not drop like that.
I’m using the included floating-point library to calculate the intensity in lux, which I understand has some problems. I’ll dump the raw data from the sensor (this is how it is stored in the memory) and calculate it on the PC.

Something seems odd about your power figures. You said elsewhere that each group of 3 LEDs in series draws about 13mA so the total current should be about 39mA. The power actually consumed by the LEDs would be 39mA * 12V = 468mW. If your power supply is actually drawing 14.3W from the mains then it is horribly inefficient. That’s OK for a bench supply but not a fair comparison of power efficiency.

Also, don’t let excel use a spline to connect the measured data points. That’s why you get the funny bumps around each change in light level.

When I see the graph, I have second thoughts about your measurements. Can you explain the sudden jumps in intensity?
If the graph showed a gradual decay I’d agree with your conclusions.
Perhaps other environmental parameters should be logged as well: temperature, for instance.