OK, so I have been a little stuck on one thing for a while now. For a 24-volt nickel-iron battery bank, each cell is 1.2 volts nominal and has a charging voltage of 1.65 volts, so the 20 cells in series need 33 volts for an optimum charge. The 24-volt, 100 Ah battery bank has an optimal charge rate of C/4 (25 amps for a 100 Ah battery).
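To keep my own numbers straight, here is the cell arithmetic worked out as a small Python sketch (these are just the figures from the paragraph above, not from any datasheet):

```python
# Sanity check of the NiFe bank arithmetic: cell count, charge
# voltage, and C/4 charge current for a 24 V nominal, 100 Ah bank.

nominal_v = 24.0
cell_v = 1.2                      # NiFe nominal cell voltage
cells = nominal_v / cell_v        # 24 / 1.2 -> 20 cells in series

charge_v = cells * 1.65           # 1.65 V per cell -> 33 V charge voltage

capacity_ah = 100
charge_a = capacity_ah / 4        # C/4 rate -> 25 A

print(cells, charge_v, charge_a)
```

This is just the back-of-envelope math; actual charge setpoints would come from the cell manufacturer.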
My question is this: if I have three solar panels rated at 50 V open-circuit (Voc) and 10 A short-circuit (Isc) that produce 400 watts each, and wire them in parallel, I will be pushing 30 amps at 50 volts to my charge controller.
Once in my charge controller, do those 30 amps and 50 volts translate into enough energy to charge my battery bank in 4 hours? Or are the 30 amps and 50 volts converted to watts to charge the battery bank, in which case I would be charging a 2.4 kWh battery bank (24 V × 100 Ah) with only 1200 watts from the panels?
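Here is my attempt at the watts view of the question as a minimal Python sketch. It uses only the numbers quoted above, assumes an MPPT-style controller that converts surplus panel voltage into charge current, and ignores real-world losses (NiFe charge inefficiency can be substantial, often 20-30%):

```python
# Rough check: can a 1200 W array supply a C/4 charge into a
# 24 V nominal NiFe bank charging at 33 V? Panel derating and
# battery charge losses are deliberately ignored here.

array_w = 3 * 400            # three 400 W panels in parallel -> 1200 W
charge_v = 33.0              # bank charge voltage (20 cells x 1.65 V)
charge_a = 25.0              # desired C/4 current for a 100 Ah bank

need_w = charge_v * charge_a      # power the controller must deliver
full_charge_h = 100 / charge_a    # 100 Ah at 25 A -> hours to full

print(f"array {array_w} W vs needed {need_w} W; "
      f"~{full_charge_h} h at C/4")
```

If I have this right, the controller only needs 33 V × 25 A = 825 W at the battery to hold the C/4 rate, so the question is whether that is how the conversion actually works.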
I would greatly appreciate someone clearing this up, as it is hindering my further understanding of these systems.