Mark wrote:[from a different thread] I've got high hopes for the results that we'll get with the inflection termination method - from all of the charging graphs that I've seen, it should work reliably even on cells that don't show a voltage drop and also at low charging rates.

Would you mind going into a little more detail about the different types of termination methods? I think I've noticed discussion of two so far, the -dV method (which Paul's code uses?) and the inflection method (which it sounds like your code uses). What's the difference? I'm guessing that battery voltages start to decline after being charged for some period of time, and so by looking at the rate of decline you can say "Okay, it's dropping by X volts/tick so it is definitely not charging anymore". The inflection method sounds like it's more along the lines of "Okay, we've gone from increasing voltage/tick to decreasing voltage/tick, so time to stop charging!" Is that correct? Is there more to it?

This is probably best answered by looking at the voltage graph for an Eneloop under charge:

Negative Delta Voltage (-dV) termination is pretty much as you've worked out. The charger continuously tracks the voltage peak and compares the current voltage to that peak. Once the voltage has dropped a certain amount from that peak - generally set somewhere between 1mV and 5mV - the charger takes that as an indication that the charge is complete. You can see the voltage drop right at the end of the charge. The main disadvantages of -dV termination are that old batteries don't always show a voltage drop, and charging at a low rate tends not to induce one either, so it's prone to missed terminations. It's also harder on the battery since it involves a certain amount of overcharging, as can be seen from the temperature line - the temperature starts to rise rapidly once the energy delivered is no longer being stored and is instead being converted to heat.
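In code, the -dV logic can be sketched roughly like this - a minimal illustration only, not the actual charger firmware; `read_voltage_mv` is a hypothetical callback, and the 3mV default is just an illustrative value from the 1-5mV range mentioned above:

```python
# Minimal sketch of -dV termination. read_voltage_mv is a hypothetical
# callback returning the cell voltage in millivolts once per sample tick.
def charge_until_neg_dv(read_voltage_mv, drop_threshold_mv=3):
    """Run until the voltage drops drop_threshold_mv below the peak seen.

    Returns the peak voltage at which termination fired.
    """
    peak_mv = 0
    while True:
        v = read_voltage_mv()
        if v > peak_mv:
            peak_mv = v  # still climbing: remember the new peak
        elif peak_mv - v >= drop_threshold_mv:
            return peak_mv  # fell far enough below the peak: charge complete
```

The key point is that the comparison is always against the highest voltage ever seen, not the previous sample, so a slow sag over many samples still triggers.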

Zero Delta Voltage (0dV) is similar to Negative Delta Voltage, but this method tries to terminate the charge right at the peak of the charge voltage rather than waiting for it to drop. It can be difficult to determine precisely when the peak occurs due to noisy measurements or a low resolution ADC. Paul's method is effectively 0dV because it's looking for any voltage drop once some other criterion is met.
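Because 0dV is so sensitive to measurement noise, a practical sketch would smooth the readings before looking for the peak. The window size and values below are illustrative assumptions, not anything from Paul's code:

```python
from collections import deque
from statistics import mean

def zero_dv_index(readings_mv, window=4):
    """Return the sample index where a 0dV termination would fire, or None.

    A moving average over `window` raw samples is applied first so that a
    single noisy ADC reading can't masquerade as the voltage peak.
    """
    buf = deque(maxlen=window)
    peak = None
    for i, raw in enumerate(readings_mv):
        buf.append(raw)
        v = mean(buf)
        if peak is None or v > peak:
            peak = v  # smoothed voltage still rising
        elif v < peak:
            return i  # smoothed voltage fell below its peak: terminate
    return None
```

Note that a plateau (smoothed voltage equal to the peak) doesn't terminate here; only an actual drop does, which is the "any voltage drop" behavior described above.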

The inflection method is a bit more complicated. For this method, the charger watches the slope of the voltage, and when the slope starts to drop off from its peak, that's taken as a sign that the charge is complete. In the above graph, the inflection point occurs at about 1800mAh. I'm going to do it a little differently and stop the charge when the slope drops below half the maximum. Standard inflection termination tends to undercharge batteries a little, but my modified method will terminate a little later - it should occur just after the temperature has started to increase, and so lead to a pretty full battery without any significant overcharge.
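The half-of-maximum-slope rule can be sketched like so - a rough illustration under an assumed sample spacing and window; the real firmware parameters may well differ:

```python
def half_slope_index(voltages_mv, window=2):
    """Return the index where the voltage slope first drops below half of
    the maximum slope seen so far (the modified inflection rule), or None.

    Slope is estimated over `window` samples; the window size and any
    millivolt values supplied by a caller are illustrative assumptions.
    """
    max_slope = 0.0
    for i in range(window, len(voltages_mv)):
        slope = (voltages_mv[i] - voltages_mv[i - window]) / window
        if slope > max_slope:
            max_slope = slope  # steepest climb seen so far
        elif max_slope > 0 and slope < max_slope / 2:
            return i  # slope collapsed below half its peak: stop charging
    return None
```

Unlike -dV, this fires while the voltage is still rising (just rising more slowly), which is why it can terminate before any overcharge-induced voltage drop appears.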

There are other termination methods:

dT/dt - in this method, the charger is looking for the rapid temperature rise at the end of the charge. The problem with this method is that high impedance cells may get quite warm during the main part of the charge and then not show too much of a slope at the end - this leads to missed terminations and overcharging the battery.

Max T - The charger can check for a maximum temperature and terminate the charge when it's detected. This is normally used as a backup termination method by a lot of chargers. Paul's code uses this method. Again, high impedance cells can be a problem - they might hit this limit before they're fully charged. If the charge rate is low enough, the ambient temperature is low enough, or there's a fan blowing on the battery, this termination criterion may never be met.

Max Voltage - The charger simply stops charging when the voltage reaches a certain level. The Maha C9000 has a maximum voltage termination set at 1.47V. Looking at the above graph, you can see that the Eneloop is well off being fully charged at that point. The C9000 includes a 2 hour 100 mA topping off charge to compensate for this. The disadvantages of this method are that poor condition cells may never reach this level, while for good cells it undercharges and requires a top off charge, which takes time. On Eneloop AA's, even this topping off charge is often not enough for a full charge. On Eneloop AAA's it's actually a fair bit of an overcharge, but the rate is low enough that it shouldn't cause too much of a problem.

Max Capacity - This is normally a backup termination method. Relying on it alone would result in batteries always being either over or undercharged, since there's no reliable way to determine in advance how much capacity needs to be put into a cell.

Max Time - A time limit can be set as a backup in case everything else fails, though the battery has generally been overcharged by the time this limit is reached. Dumb timer-based chargers use this method and rely on the rate being low enough that the overcharge doesn't cause (serious) damage to the battery.
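Taken together, the backup limits above amount to a simple set of threshold checks, something like the sketch below. All the field names and limit values here are hypothetical illustrations, not the charger's real configuration:

```python
def backup_cutoff(state, limits):
    """Return the name of the first backup limit that has been hit, or None.

    `state` and `limits` are plain dicts with hypothetical field names; the
    check order below is arbitrary, since any hit should stop the charge.
    """
    if state["temp_c"] >= limits["max_temp_c"]:
        return "max temperature"
    if state["voltage_mv"] >= limits["max_voltage_mv"]:
        return "max voltage"
    if state["capacity_mah"] >= limits["max_capacity_mah"]:
        return "max capacity"
    if state["elapsed_min"] >= limits["max_time_min"]:
        return "max time"
    return None
```

In a real charger these checks would run on every sample tick alongside the primary (-dV, 0dV or inflection) detection.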

Let me know if I've missed any above or if you would like clarification on any of this!

Edit: I'll include a zoomed in image with the relevant points highlighted:

if you could confirm - in summary of your post, Paul's original 2xAA UltraSmartCharger (USC) uses the following algorithm:
1. altered inflection method for its main detection
2. Max Temp as a backup cutoff.
noting;
- no max timeout cutoff
- no max capacity cutoff

will this be the same on the coming LCD version? will the detection methods be alterable in the GUI? will the inflection slope be alterable?

The charger doesn't use a timer cut off at all, but it does have a capacity limit as a backup.

This is the same as on the LCD charger - the actual code for handling the termination is exactly the same between chargers.

On both chargers, the capacity limit can be changed - through the LCD interface on the LCD charger, and through the configuration utility for the non LCD charger.

There isn't any provision in the code at the moment to handle charging at low temperatures. It could be added in, but it never gets cold enough here that I could test it. How many people allow the temperature inside their home to drop below 5 degrees C?

Keep in mind that the act of charging NiMH cells warms them up, so the ambient temperature would need to be significantly below 5 degrees for it to be a serious problem. With the inflection method of termination, the charger should be stopping the charge before oxygen and hydrogen gas is being formed, so that's also another reason why I wouldn't be too worried.

while charging them would warm them, apparently they need to be limited to 0.3C to 0.1C, as by the time inflection is reached the cell would have already died. for as much as one should trust wikipedia - "when charging below 5 °C (41 °F), the ability to recombine oxygen and hydrogen diminishes. If NiCd and NiMH are charged too rapidly, pressure builds up in the cell that will lead to venting" - see http://batteryuniversity.com/learn/arti ... mperatures for more details

re temps: Ipswich, Australia will reach 3°C Mon-Wed; overnight in someone's shed it could reach this?
not concerned myself - just curious for comparison and how "ultrasmart" it is. at these temps I would be more worried about condensation on the charger shorting something out.

so to re-summarize, all USCs do the following for charge completion cutoff:
1. altered "inflection method" for the main cutoff detection
2. Max Temp as a backup cutoff
3. Max Capacity as a secondary backup cutoff

- no trickle top off
- no max timeout cutoff (capacity is a better substitute, i.e. current integrated over time)
- no low temp compensation (this is not an outdoor device, shouldn't ever be an issue)
- operating at cell temps 0°C-5°C, current should be manually reduced to 0.3C
- operating at cell temps -18°C-0°C, current should be manually reduced to 0.1C (probably shouldn't be left unattended)

2114L3 wrote:while charging them would warm them, apparently they need to be limited to 0.3C to 0.1C, as by the time inflection is reached the cell would have already died.

According to the linked page, the problem is that Oxygen/Hydrogen recombination is slowed at lower temperatures. Normally, Oxygen and Hydrogen aren't produced until the cell is being overcharged - it's not entirely clear from that page whether charging at lower temperatures causes gas production before the cell is full - have you seen anything to confirm this one way or the other?

Keep in mind that at lower temperatures, internal resistance increases - if it increases enough, the charger will automatically reduce the charging current.

The predicted minimum is 3°C. Even inside a shed that doesn't have any form of insulation, I think that it would be reasonable to expect the inside temperature to stay somewhat higher than this. If you look at the graph at the top of this forum post, you can see that it doesn't take long for the cell temperature to increase by 5°C so even under such circumstances, I wouldn't expect the cell to be below 10°C for very long...

not concerned myself - just curious for comparison and how "ultrasmart" it is. at these temps I would be more worried about condensation on the charger shorting something out.

I suppose that condensation could be a problem as well. As a general rule, it's best not to do a lot of things at extreme temperatures, and charging batteries is just one of them. At least NiMH is a lot more forgiving of charging at lower temperatures than Lithium Ion cells!

so to re-summarize, all USCs do the following for charge completion cutoff:
1. altered "inflection method" for the main cutoff detection
2. Max Temp as a backup cutoff
3. Max Capacity as a secondary backup cutoff

- no trickle top off
- no max timeout cutoff (capacity is a better substitute, i.e. current integrated over time)
- no low temp compensation (this is not an outdoor device, shouldn't ever be an issue)
- operating at cell temps 0°C-5°C, current should be manually reduced to 0.3C
- operating at cell temps -18°C-0°C, current should be manually reduced to 0.1C (probably shouldn't be left unattended)

can't wait to buy an LCD version.

That's pretty much it.

You can set it to do a fixed current top off for a certain capacity - this isn't normally enabled for regular charging but is enabled for the test modes so as to get the maximum capacity result from the test - default is a 200 mAh top off at 200 mA. Once the top off is complete, no further charging takes place after that.

If the ambient temperature is going to be below 5°C I'd simply recommend not using the charger unless absolutely required - this shouldn't be a problem for most people unless they're living in an igloo!

As I understand it (in a totally unprofessional and unqualified position) - the chemical reaction of charging NiMH is that hydrogen is pushed into the electrolyte when an electron is pulled through the charger.
This hydrogen is turned to H2O in the electrolyte, then hydrogen is broken back off at the positive electrode to complete the reaction. It sounded like the low temps could/would slow the hydrogen + hydroxide reaction into the electrolyte and leave the hydrogen free to build up pressure if the charge current is too high.

BUT I agree with you that a hydrogen and oxygen "recombination" process is only present in an overcharge situation.
I am unsure if the charge process itself is actually affected by low temps.

2114L3 wrote:As I understand it (in a totally unprofessional and unqualified position) - the chemical reaction of charging NiMH is that hydrogen is pushed into the electrolyte when an electron is pulled through the charger.
This hydrogen is turned to H2O in the electrolyte, then hydrogen is broken back off at the positive electrode to complete the reaction. It sounded like the low temps could/would slow the hydrogen + hydroxide reaction into the electrolyte and leave the hydrogen free to build up pressure if the charge current is too high.

You could be right there.

BUT I agree with you that a hydrogen and oxygen "recombination" process is only present in an overcharge situation.
I am unsure if the charge process itself is actually affected by low temps.

A quick scan of that page looks interesting - I'll have to go back and take a more detailed look later when I've got more time.

thank you for indulging my curiosity.

No problem - thanks for raising some interesting questions.

I'm thinking that it might be interesting to stick a discharged Eneloop in the freezer overnight and see what happens when trying to charge it normally...

It's probably not something that we need to be too concerned about, but if there is anyone who is worried that their charger/batteries are likely to be exposed to temperatures below 10°C when charging, please let me know and I'll look at adding appropriate checks and adjustments as required.

dT/dt - in this method, the charger is looking for the rapid temperature rise at the end of the charge. The problem with this method is that high impedance cells may get quite warm during the main part of the charge and then not show too much of a slope at the end - this leads to missed terminations and overcharging the battery.

I just wanted to add to the discussion that dT/dt is probably the least fortunate method to implement in a powerful multi-bay universal charger. Powerful chargers work with high charge and high discharge rates, generating heat both in the batteries and inside the charger. To keep the temperatures of the electronic parts and components at an acceptable level, the device would come with a heat sink and also a ventilation fan. All powerful hobby chargers and also some notable multi-bay universal chargers (NC2500, BT-C3100, MC3000, among others) have an internal fan. I've been recording the temperature graphs of the batteries in such a fan-operated device, and one can easily see the influence of the fan on the temperatures.

Battery temperature depends on the ambient air temperature, the cell's internal heat generation, the two cold/warm/hot anode|cathode metal contacts, the radiative heat transfer from hot emitting sources, and finally the hopefully unhindered heat conduction to the temperature sensor. A battery surrounded by hot batteries will pick up some of their emitted heat, while fan-cooled internals will indirectly reduce the temperature of the batteries in the bay through the cooled-off metal contacts. And in general, to maintain cell health, a charger should always try to keep the temperatures low by removing the cell heat as effectively as possible.

Therefore, in practice, the dT/dt method is impossible to model if the programmer tried to take all physical effects, side effects and potential influences into account. For a single-channel NiMH charger without a bay and without an automatic fan, like a typical hobby charger in a dark lab with constant room temperature, the dT/dt method would have a place. But with an integrated bay and 2+ channels, the whole method becomes a complete mess and unreliable in the end.

See for example the below 4-set of graphs representing the four battery slots of the MC3000. Slot#1 was occupied with a 1.2V-1s3p round 3AA-to-D-size parallel adapter containing three Eneloop AA's, slot#3 the same, whereas slot#2 and slot#4 were loaded with 1.2V-1s4p round 4AAA-to-C-size parallel adapters, both containing four Eneloop AAA's. Yes, basically I was charging 14 Eneloops at the same time, with a massive 3.0A per slot! Anyhow, the effect of 'neighboring influences' on the external temperature sensors can be seen very well: slot#3 gets hotter than slot#1 because it is surrounded by slot#2 and slot#4, and as soon as the 8 AAA's have finished charging and begin to cool off, slot#1 and slot#3 quickly drop in temperature too. Obviously, most of the temperature (sensor) rise in slot#1 and #3 was attributable to the emitted heat of the neighboring densely packed 4xAAA's. One can see how the temperatures rise after ~1h57min, near the end of the charge termination, marking a local minimum in the curves.
When I did this test run, I didn't actually have the temperatures in mind. The original purpose of the test run was to demonstrate that it is possible to charge (equally conditioned) Eneloops in a parallel configuration without problems, and that the MC3000 can do so with up to 1C per cell when charging 16 AAA's simultaneously, or up to 0.5C/AA with 12 cells, or, as in this example, 8 AAA's and 6 AA's. The use of differently sized adapters - a densely packed C-size adapter vs. a spaciously packed D-size adapter - only intensified the adverse effects of heating, heat accumulation, and heat emission vs. fast cooling: an extra variable which entered the physics in our example is the plastic adapters themselves! The D-size adapter has thick plastic walls at some spots, and they definitely reduce the amount of cell heat reaching the temperature sensors.

This example also illustrates that it would be negligent and imho wrong (for a post writer or a charger reviewer) to take an isolated look at a single slot and its performance, or even worse, to run a test in just 1 slot and leave the other three slots empty. Well, one could do the latter. But then you would never know whether, and to what extent, running charge/discharge programs in the other slots at the same time affected your temperature measurements. There you have it: battery temperature is influenced by too many variables - it is too sensitive a physical quantity to build one's NiMH termination method upon.