Well, there's a lot more to it, and I only really feel comfortable explaining the basics.
Li cells tend to run about 3.2-3.7 volts nominal, depending on chemistry (LiFePO4 at the low end, most other Li-ion at the high end). I've seen the numbers for max, min, and nominal voltages vary.
If you run them in series, the voltage increases, but the amp hours stay the same. If you run them in parallel, the voltage stays the same, but the amp hours increase.
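To keep the series/parallel rule straight, here's a minimal sketch. The 3.2 V / 100 Ah cell values are just assumptions for illustration, not a spec for any particular cell:

```python
# Rough sketch: how series vs. parallel strings change pack voltage and capacity.
cell_voltage_v = 3.2    # nominal volts per cell (assumed, LiFePO4-ish)
cell_capacity_ah = 100  # amp hours per cell (assumed)

def pack_specs(series_count, parallel_count):
    """Series multiplies voltage, parallel multiplies amp hours."""
    pack_voltage = cell_voltage_v * series_count
    pack_capacity = cell_capacity_ah * parallel_count
    return pack_voltage, pack_capacity

print(pack_specs(4, 1))  # 4 in series: (12.8 V, 100 Ah)
print(pack_specs(4, 4))  # 4 parallel strings of 4: (12.8 V, 400 Ah)
```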
The amp hour numbers can be very deceptive because they vary based on discharge rate, and FLA/AGM batteries do not seem to have a consistent standard for explaining what their claimed amp hour ratings are based on. Essentially, the faster you drain the battery, the less total energy you'll get out of it.
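The usual way that discharge-rate effect gets modeled for lead-acid is Peukert's law. Here's a rough sketch; the 1.2 exponent and the 20-hour rating are assumed "typical" lead-acid values, not numbers from any real datasheet:

```python
# Effective lead-acid capacity shrinks as discharge current rises (Peukert's law).
def effective_capacity_ah(rated_ah, rated_hours, discharge_amps, peukert_k=1.2):
    # Peukert: runtime t = H * (C / (I * H)) ** k, so delivered Ah = I * t
    runtime_h = rated_hours * (rated_ah / (discharge_amps * rated_hours)) ** peukert_k
    return discharge_amps * runtime_h

print(effective_capacity_ah(100, 20, 5))   # ~100 Ah at the rated 20-hour (5 A) draw
print(effective_capacity_ah(100, 20, 25))  # ~72 Ah at a faster 25 A draw
```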
Since most of us are trying to determine the actual *work* we will be getting out of the battery, I prefer to look at watt hours / kilowatt hours (Wh/kWh). To determine the Wh in a battery, just multiply the nominal voltage (e.g., 12 V) by the amp hours (e.g., 100 Ah), which works out to 1,200 Wh (1.2 kWh) in this example. Caveat: See above regarding the varying standards for Ah ratings. Now you have the watt or kilowatt hours of energy in the battery. Of course, you don't want to exceed the recommended depth of discharge of the battery, so you will want to figure out what percentage of that energy is actually available (i.e., < 50% DoD for FLA/AGM and < 80% DoD for Li).
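Here's that math as a quick sketch, using the 12 V / 100 Ah example and the DoD limits from above:

```python
# Nominal volts x amp hours = watt hours, then apply a depth-of-discharge limit.
def usable_wh(nominal_volts, amp_hours, max_dod):
    total_wh = nominal_volts * amp_hours
    return total_wh * max_dod

print(usable_wh(12, 100, 0.50))  # FLA/AGM at 50% DoD -> 600 Wh usable
print(usable_wh(12, 100, 0.80))  # Li at 80% DoD -> 960 Wh usable
```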
Also, this is kind of working backward. You'll need those calculations, but the first thing you'll want to know is how many Wh it actually takes to run your car. For instance, if you are spending 100 Wh/mile (just pulling that out of a hat), a battery that provides you with 1 kWh of usable electricity is only going to get you 10 miles before you start to risk seriously damaging the battery or having to stop driving. So the first step really should be determining how much electricity it takes to run your car.
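Working that example forward, here's the range math as a sketch; the 100 Wh/mile figure is the same out-of-a-hat number as above:

```python
# How far a given amount of usable energy gets you at a given consumption rate.
def range_miles(usable_kwh, wh_per_mile):
    return usable_kwh * 1000 / wh_per_mile

print(range_miles(1.0, 100))   # 1 kWh usable at 100 Wh/mile -> 10 miles
print(range_miles(0.96, 100))  # the 12 V / 100 Ah Li example above -> ~9.6 miles
```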
Also, on a side note: I only chose 100 Ah because a) it was the size battery I was considering when I was thinking about building an EV (on the shelf right now) and b) it seemed like a nice, round number to do math with. :-P