I use smart chargers. The basic design is that each time the battery hits a set voltage the charge rate drops: it starts at 3.8 amps, then 3 amps, then 0.8 amps, and finally 0.1 amps. However, if the voltage drops below 12.8 volts while on the 0.1 amp rate, the charger returns to the 0.8 amp rate. Sometimes it has returned to 0.8 amps, but normally, even left at 0.1 amps, the voltage will slowly rise. Once it reaches 13 volts I consider the battery fully charged and move the charger to the next battery. The Honda Jazz and Kia Sorento batteries were both put on charge at the same time; I use the Honda engine compartment to protect the two extension leads from the weather, as one lead is not long enough to reach the Kia.
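The stage-switching behaviour described above can be sketched as a small state machine. This is a minimal illustration, not the charger's actual firmware: the 12.8 volt drop-back threshold and the rate steps are from the description, but the per-stage "set voltage" (`V_STEP` below) is a hypothetical placeholder, since the post does not say what voltage triggers each step-down.

```python
# Sketch of the described multi-stage charger logic.
# RATES and the 12.8 V drop-back come from the post;
# V_STEP (the "set voltage" that ends a stage) is a
# hypothetical assumption for illustration only.

RATES = [3.8, 3.0, 0.8, 0.1]   # charge rates in amps, first stage first
DROP_BACK_V = 12.8             # below this at 0.1 A, return to 0.8 A
V_STEP = 14.4                  # assumed stage-complete voltage

def next_rate(current_rate: float, voltage: float) -> float:
    """Return the charge rate the charger would switch to next."""
    idx = RATES.index(current_rate)
    # On the lowest (float) rate, sagging below 12.8 V
    # sends the charger back up to the 0.8 A stage.
    if current_rate == 0.1 and voltage < DROP_BACK_V:
        return 0.8
    # Otherwise, hitting the set voltage steps down one stage.
    if voltage >= V_STEP and idx < len(RATES) - 1:
        return RATES[idx + 1]
    return current_rate
```

Under this model, the Kia's behaviour is the first branch firing repeatedly: the battery keeps sagging below 12.8 volts on the float rate, so the charger keeps bouncing back to 0.8 amps.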
So the Honda has charged up, reached 13.8 volts at 0.1 amps, and was taken off charge, but the Kia is still alternating between the 0.8 and 0.1 amp rates. Comparing the pattern just after it finished the 3 amp rate with the pattern some days later, I can understand why the time at 0.8 amps is shorter: as the battery gets closer to fully charged I expect the time at 0.8 amps to reduce. But why is the voltage dropping faster?
It is a 95 Ah battery, still connected to the vehicle, which had been allowed to self-discharge to under 12 volts. I did not try to start the car; I knew it would not be used for some time, so I wanted to top up the battery before winter set in. It was taken for a run around a month ago and I have no reason to suspect any fault. This is purely out of interest, there is no known problem; without an energy monitor I would not even have known what was happening and would simply have taken it off charge. As an auto electrician I never had the ability to monitor charge rates like this. So it is pure interest, not a problem, but why is the voltage decay time decreasing?