Best Of

I am looking into the temperature probe discrepancy, or whether there even is one. I took this image on 10 June at 1:45 pm. If I recall correctly it was a very hot day, which is what spurred me to take the image of the batteries. As you can see from the range chart in the right-hand corner, the highest temperature in the image is 88.1°F. Looking back through the history for the entire month of June, the highest temperature recorded was 32°C, which is about 89°F. In my experience, sensors don't fix themselves. Looking back through the full history, the highest temperature ever recorded was 33°C, so I can only assume that these temperature readings are correct. More questions than answers.

Thank you, Mike. The press makes it sound like this is new and unusual. It has been a nasty bout, but one of the main reasons for these fires is people who move into these rural places and expect the fire people to be able to protect them. The fire guys & gals get caught doing structure protection and often can't fight the fires. In the old days there were very few houses away from the big cities.

Much easier to fight fires when not having to save people and their homes.

IMHO (others here with more experience with AGMs may be able to help more), the most likely problem is an overcharge relative to battery temperature. Even in the March V.I. heat, hardened sulfation from undercharging would take a while. Venting gas would be a sudden event.

The root cause of the overcharge could be a bad temp sensor. Although it may have been a transient problem (e.g. a loose connection), it might be worth checking the consistency of currently reported temp values on the charging devices vs. actual battery temp.

You mentioned "the two sensors" from the Classics. Mine have a single sensor, and are wired and configured for "Follow Me", in which when one controller goes to Absorb or Float, the other follows. There's a single temp sensor, which is read and used for compensation by both. It sounds like each of yours has a separate sensor and operates independently. Since I suspect your voltages would always be compensated down in your climate, a bad or misconfigured sensor on one of the Classics could have been the problem.

If you have classic log data from March, it might be interesting to compare logged high battery voltages vs estimated target voltage adjusted for ambient high temps for the same period.
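That comparison can be sketched in a few lines. This is a hypothetical helper, not anything from the Classic's own tools; it assumes a common lead-acid coefficient of -5 mV/°C per cell (-0.030 V/°C for a 6-cell, 12 V battery) referenced to 25°C — check the Temp Comp value actually configured on your controller.

```python
# Hypothetical sketch: flag days where the logged high battery voltage
# exceeded the temperature-compensated target setpoint.

REF_TEMP_C = 25.0
COEFF_V_PER_C = -0.030   # 6 cells * -5 mV/°C/cell (assumed, not from the source)

def compensated_target(base_setpoint_v, battery_temp_c):
    """Charge setpoint adjusted for battery temperature."""
    return base_setpoint_v + COEFF_V_PER_C * (battery_temp_c - REF_TEMP_C)

def flag_overshoots(daily_log, base_setpoint_v, margin_v=0.1):
    """daily_log: list of (date, high_voltage, high_temp_c) tuples.

    Returns the days where the logged high voltage exceeded the
    compensated target by more than margin_v.
    """
    return [
        (date, v_high, compensated_target(base_setpoint_v, t_high))
        for date, v_high, t_high in daily_log
        if v_high > compensated_target(base_setpoint_v, t_high) + margin_v
    ]
```

Feeding it the March log highs alongside estimated battery temps for the same days would show whether one controller was consistently charging above where compensation should have held it.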

Take a look at the Reason for Resting number. From the main status screen, HOLD down the LEFT-ARROW key and tap the ENTER key... A new menu comes up and there should be a number at the top middle of the screen (and some other numbers). OR, take a picture of it.

I got more from this one response than I have from reading 10+ articles on the subject. Thank you. I think that what I really want, after delving into solar, is a backup system, with solar to keep the batteries charged. Oh yeah, I already have energy-efficient appliances, insulation all around, upgraded doors and windows, and an encapsulated crawl space.

Ignoring the actual system and costs for a moment... Just to take some generic back of the envelope calculations:

3,000 Watt array * 0.52 off grid system eff * 5 hours of sun per day = 7,800 Watt*Hours per day

7.8 kWh per day * $0.20 per kWh utility cost = $1.56 worth of utility power per day

$1.56 per day * 30 days = $46.80 per month power costs (sunny months of the year)

Assume average usage is ~75% (or even as low as 65%) of generated power (you cannot use 100% of off grid energy per day--Some days more sun than you use, other days, you have to cut back or use utility/genset power when the weather is bad/winter).
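The arithmetic above can be written out as a minimal sketch; every figure (efficiency, sun hours, rate, usable fraction) is just the post's own generic estimate, not a measured value.

```python
# Back-of-envelope off-grid value calculation, using the estimates above.

array_watts = 3000
system_eff = 0.52          # assumed end-to-end off-grid system efficiency
sun_hours = 5.0            # full-sun-equivalent hours per day
rate_per_kwh = 0.20        # utility cost, $/kWh
usable_fraction = 0.75     # ~65-75% of generation actually gets used

daily_wh = array_watts * system_eff * sun_hours          # 7,800 Wh/day
daily_value = (daily_wh / 1000) * rate_per_kwh           # $1.56/day
monthly_value = daily_value * 30                         # $46.80/month
usable_monthly = monthly_value * usable_fraction         # ~$35.10/month
```

The usable fraction is the sobering part: even in sunny months, a fair chunk of the generated energy goes unused or is displaced by genset/utility power in bad weather.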

What is the PV INPUT voltage? Have you tried going to the TWEAKS menu and forcing a BULK just to see if it comes out of resting? When the battery gets hot, the setpoint voltage goes down even further, which might be a part of this.

If you go into the CHARGE menu, then the Temp-Comp sub-menu, and hit the VIEW key (upper-right or soft-right key), it will show what the present target voltage is... If it is lower than the actual battery voltage, then that is all it is.

The INPUT voltage must also be higher than the battery voltage by a certain amount too to start up from resting.

No, no other charge sources available. This now would make sense, as Float is set @ 13.4v per battery manufacturer spec, and it is also 107F outside. My guess is that the temperature compensation is seeing that and backing the voltage down slightly. My bet is that if I did the math, I am right about where I should be.
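That math can be roughed out quickly. This sketch assumes a typical lead-acid coefficient of -5 mV/°C per cell from a 25°C reference, and that the battery is near the 107°F ambient; the actual coefficient configured on the Classic may differ.

```python
# Rough check of the temp-comp guess above (assumed coefficient, not the
# Classic's actual configured value).

def f_to_c(f):
    return (f - 32) * 5 / 9

cells = 6                      # 12 V lead-acid battery
coeff = -0.005 * cells         # V/°C for the whole battery (assumed)
float_setpoint = 13.4          # per manufacturer spec, from the post

battery_temp_c = f_to_c(107)   # ~41.7 °C, assuming battery near ambient
compensated = float_setpoint + coeff * (battery_temp_c - 25.0)
# compensated comes out to about 12.9 V
```

A compensated target of roughly 12.9 V would indeed sit below the 13.2 V the batteries were reading, which is consistent with the controller resting rather than pushing power.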

Midnite Solar just is not very clear in their documentation about what the device status is or means.

Am having a bit of a problem understanding exactly what you are saying when you said, " ... so looking at it is is reading 60.x watts 0000 watts coming in and resting. Batteries reading 13.2v Should I now not be in float rather than resting ? ..."

One thought, is, when the Classic finishes Absorb, it will enter the Float stage, and that will be displayed for about 90 seconds, but, depending on the amount of load on the system, the battery voltage will remain above the temperature-compensated Float setpoint (Vflt). After those about 90 seconds, if the battery voltage remains above the compensated Vflt, the Classic will "Rest", until the battery voltage descends to the comped Vflt, and again show Float, but with some power being delivered to the batteries and loads.

Do you believe that you might have been looking at the Classic when it was in this stage? The time period for power being produced in Float after exiting Absorb is often minutes, depending on the size of the battery bank, the kind of batteries (Lithium batteries often require a long time to begin producing power, after exiting Absorb), and the loads on the system ...
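The Float/Rest behavior described above can be sketched as a tiny state function. This is a hypothetical simplification of what the display shows, not the actual Classic firmware logic, which is more involved.

```python
# Simplified sketch of the displayed state after Absorb ends, per the
# description above (~90 s of Float, then Rest while voltage stays high).

def float_stage_state(battery_v, comped_vflt, seconds_since_absorb):
    """Return the displayed state after the Classic exits Absorb."""
    if seconds_since_absorb < 90:
        return "Float"        # Float is shown for about 90 s regardless
    if battery_v > comped_vflt:
        return "Resting"      # no power needed; wait for voltage to fall
    return "Float"            # delivering power to hold the Float setpoint
```

So a reading of 13.2 V against a heat-compensated Float target of ~12.9 V would show "Resting" with 0 watts coming in, exactly as observed.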