Quote:
Originally Posted by 2000mc
If you supply a battery with a constant 13.5v (or whatever voltage you typically see in your car while maintaining highway speeds) for an unlimited amount of time, what is the lowest amperage the battery falls to?
I'm wondering if it could be a couple amps for a lead acid, and just slightly less for an AGM. But since 13.5v wouldn't be nearly as high a state of charge for capacitors or LiFePO4 batteries, would it be near 0?
I'm not sure I followed your whole train of thought, but a capacitor given a sustained 13.5v will take on charge very rapidly at first. As it gets ever closer to 13.5v, the current drops off until there is practically zero flow. In that sense a capacitor is very efficient: it stops drawing energy once it reaches equilibrium with the alternator's voltage.
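That taper follows the standard RC charging curve. A minimal sketch (the series resistance and capacitance values here are purely assumed for illustration, not measured from any real car):

```python
# Sketch: current into an ideal capacitor held at a constant 13.5 V source
# through an assumed series resistance R. Classic RC result:
#   i(t) = (V / R) * exp(-t / (R * C))
import math

V = 13.5   # supply voltage (volts)
R = 0.05   # assumed wiring/ESR resistance (ohms) - hypothetical value
C = 1.0    # assumed capacitance (farads), e.g. a car-audio stiffening cap

def charge_current(t):
    """Current (amps) flowing into the cap t seconds after the supply is applied."""
    return (V / R) * math.exp(-t / (R * C))

# Current starts large and collapses toward zero as the cap nears 13.5 V:
for t in (0.0, 0.05, 0.25, 0.5):
    print(f"t={t:5.2f} s  i={charge_current(t):10.3f} A")
```

With these made-up numbers the current is hundreds of amps at the instant of connection and effectively zero within half a second, which is the "practically zero current flow" behavior described above.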
Am I on the right track with your question?
I'm curious whether a lead acid battery continues to draw current (and dissipate energy) even when it is already at full charge. If it does, I wonder how that compares to a LiFePO4 battery, considering lithium chemistries require charge current to cease once full charge is reached.
I just got 2 ammeters with logging capability, so I can begin to compare how much energy goes into a battery vs how much goes back out.
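That in-versus-out comparison boils down to integrating the logged current over time. A rough sketch of the bookkeeping, assuming fixed-interval samples and a sign convention of positive for charging, negative for discharging (both assumptions, since the actual meters' log format isn't specified):

```python
# Sketch: amp-hours into vs out of a battery from a logged current trace.
# Assumes evenly spaced samples; positive amps = charging, negative = load.
def amp_hours(samples, dt_s):
    """Return (ah_in, ah_out) from a list of currents sampled every dt_s seconds."""
    ah_in = sum(i for i in samples if i > 0) * dt_s / 3600.0
    ah_out = -sum(i for i in samples if i < 0) * dt_s / 3600.0
    return ah_in, ah_out

# Toy trace: 10 A of charging for 60 s, then a 5 A load for 60 s.
trace = [10.0] * 60 + [-5.0] * 60
ah_in, ah_out = amp_hours(trace, 1.0)
coulombic_efficiency = ah_out / ah_in  # fraction of charge recovered
```

Comparing those two totals over a full charge/discharge cycle would give the charge efficiency you're after.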
The only problem is that the meters are rated at 100A, and measuring the starter would blow them. Starting is the biggest single drain on a battery, so it would be important to capture that data. Once the car is running, the battery rarely supplies power to anything, and when it does, it's only for a brief moment.
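For a sense of scale, even a back-of-envelope estimate shows how much charge one start moves (both figures below are assumed round numbers, not measurements):

```python
# Sketch: rough amp-hour cost of a single engine start.
starter_amps = 200.0   # assumed average cranking current (amps)
crank_s = 2.0          # assumed cranking duration (seconds)

start_ah = starter_amps * crank_s / 3600.0  # amp-hours drawn per start
print(f"~{start_ah:.3f} Ah per start")
```

So while the cranking current itself far exceeds a 100A meter's rating, the amp-hours per event are modest, which is worth keeping in mind when totaling up what the charging system has to put back.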