Quote:
Originally Posted by hayden55
But typically, yes, fewer amps will be more efficient if the ambient temperature being too cold is not a factor.
But yeah, somebody try varying your amps on your 240V charger and note the efficiency change. The longer charge should be more efficient, and it is better for battery longevity. Fast charging reduces the available cycle life a lot more.
You're conceptualizing this as charging a bare battery, and it's not. It's charging an EV, with everything else in the car that runs during the charge.
The longevity argument doesn't apply to charging an EV on L2, because most EV cells are rated for around a 1C charge rate. In something like the Chevy Bolt, 1C would be 60 kW. The on-board charger in the Bolt is limited to 7.7 kW, which works out to roughly 0.13C, or around 10% of what the cells are actually rated for. You're not going to see any difference in longevity between charging at 0.1C and 0.02C. At both rates, the charging tapers to the same rate toward the end of the charge, which is where it matters most.
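To put rough numbers on that, here's a quick back-of-envelope check. The 60 kWh pack size and 7.7 kW charger limit are the figures above; the 1.2 kW "reduced-amp" rate is just an assumed example of a low-current charge, not a spec.

```python
# Back-of-envelope C-rate check using the figures above.
# The 1.2 kW reduced-amp rate is an assumed example, not a spec.

pack_kwh = 60.0       # Chevy Bolt pack capacity (kWh)
l2_charger_kw = 7.7   # Bolt's on-board charger limit (kW)
low_amp_kw = 1.2      # assumed reduced-current charge rate (kW)

def c_rate(power_kw, capacity_kwh):
    # 1C means charging the full pack capacity in one hour
    return power_kw / capacity_kwh

print(f"1C for this pack:  {pack_kwh:.0f} kW")
print(f"Full L2 charging:  {c_rate(l2_charger_kw, pack_kwh):.2f}C")
print(f"Reduced-amp rate:  {c_rate(low_amp_kw, pack_kwh):.2f}C")
# Both come out at roughly a tenth of 1C or less, far below the cells'
# rated charge rate, so the cycle-life impact is negligible either way.
```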
The Bolt conditions the battery to around 70°F while L2 charging: if the pack is too hot it cools it, and if it's too cold it warms it. It maintains that temperature throughout the charge, so the longer the charge takes, the more energy goes toward holding that temperature. A coolant pump runs the whole time.
All EVs trickle-charge the 12V battery during charging, which means the DC/DC converter is active the whole time. The longer the charge takes, the longer the 12V battery trickle-charges and the longer the DC/DC converter runs.
My Prius ran a cooling fan for the duration of the charge. The longer the charge, the more energy was lost driving that fan.
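To make the overhead point concrete, here's a rough sketch of how those fixed loads scale with charge time. The overhead wattages (conditioning pump/heater, DC/DC converter feeding the 12V system, cooling fan) are illustrative assumptions, not measured values, and charger conversion losses are left out.

```python
# Rough sketch of why a slower charge can waste MORE energy overall:
# the overhead loads run for the entire charge, so their energy cost
# scales with charge time. All wattages below are illustrative
# assumptions, not measurements; charger conversion losses are ignored.

energy_needed_kwh = 30.0   # energy to put into the pack this session

overhead_w = {
    "battery conditioning (pump + heat/cool)": 300,  # assumed average draw
    "DC/DC converter + 12V trickle charge":    150,  # assumed
    "cooling fan":                              50,  # assumed
}
overhead_kw = sum(overhead_w.values()) / 1000.0

def wall_energy(charge_power_kw):
    # Total energy drawn: pack energy plus overhead running the whole charge
    hours = energy_needed_kwh / charge_power_kw
    return energy_needed_kwh + overhead_kw * hours, hours

for power_kw in (7.7, 3.3, 1.4):   # full L2, reduced L2, 120V-ish
    total, hours = wall_energy(power_kw)
    eff = energy_needed_kwh / total * 100
    print(f"{power_kw:>4.1f} kW charge: {hours:5.1f} h, "
          f"{total:5.1f} kWh from the wall, ~{eff:.0f}% efficient")
```

The slower the charge, the longer those fixed loads run, so the overhead fraction of the total energy keeps growing.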
Regarding transformers, they're around 98% efficient, so a transformer might account for about 2 percentage points of the ~20% overall charging loss.
The way I measured the loss was to take the difference between the energy measured at the wall and the amount of charge reported by the car.
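For example (made-up placeholder numbers chosen to line up with the ~20% loss figure above, not my actual readings):

```python
# How the overall charging efficiency was derived: compare what the
# wall meter recorded against what the car reports it added to the pack.
# These numbers are made-up placeholders, not actual measurements.

wall_kwh = 15.0           # measured at the wall with a meter
car_reported_kwh = 12.0   # energy added, as reported by the car

loss_kwh = wall_kwh - car_reported_kwh
efficiency = car_reported_kwh / wall_kwh

print(f"Loss: {loss_kwh:.1f} kWh ({(1 - efficiency) * 100:.0f}% of wall energy)")
print(f"Overall charging efficiency: {efficiency * 100:.0f}%")
```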