02-25-2009, 03:17 PM   #38
jyanof
Joe
 
Join Date: Feb 2009
Location: phx
Posts: 260
Thanks: 0
Thanked 48 Times in 38 Posts
Quote:
Originally Posted by MazdaMatt
I would definitely keep the current sensing to the battery line. I don't understand what a 50mv shunt is? ...

Total system design should take into account a maximum power and the input ratings should be specified accordingly.
Oh, maybe better terminology would be a 50A/50mV shunt, which would be a shunt of 0.001 Ohm resistance. (My electrical engineering greenness is showing through)
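To spell out the arithmetic: the rating means the shunt drops 50 mV at 50 A, so R = 0.050 V / 50 A = 0.001 Ohm, and the battery current is just the measured drop divided by that resistance. A quick sketch in plain C (the numbers are only for illustration):

Code:
#include <stdio.h>

/* Hypothetical 50A/50mV shunt: R = 0.050 V / 50 A = 0.001 ohm. */
#define SHUNT_OHMS 0.001

/* Ohm's law: I = V / R, using the voltage dropped across the shunt. */
static double shunt_current(double v_shunt)
{
    return v_shunt / SHUNT_OHMS;
}

int main(void)
{
    /* 25 mV across the shunt means 25 A flowing in the battery line. */
    printf("%.1f A\n", shunt_current(0.025));
    return 0;
}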

As for input ratings, I think versatility of a charger is important. My goal would be to plug in at home on my 240V 30A line, then drive to my parents' place and charge off their 120V 20A line, and make a trip by the school and plug in to the 120V 10A outlet. My current problem (pun intended) is that the charger I have now only runs off a 240V 30A outlet, so I can pretty much only charge at home.
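For a sense of scale, here's the usable power from each of those outlets (assuming the usual 80% derating for continuous loads; adjust to taste):

Code:
#include <stdio.h>

/* Rough power budget for each outlet I'd want to charge from.
   The 0.80 factor is the standard continuous-load derating. */
static double usable_watts(double volts, double amps)
{
    return volts * amps * 0.80;
}

int main(void)
{
    printf("240V/30A: %.0f W\n", usable_watts(240.0, 30.0)); /* 5760 W */
    printf("120V/20A: %.0f W\n", usable_watts(120.0, 20.0)); /* 1920 W */
    printf("120V/10A: %.0f W\n", usable_watts(120.0, 10.0)); /*  960 W */
    return 0;
}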

Thus the need to adjust the input power (current). I guess there are a number of ways to do this (thinking out loud again...). Probably the most time-efficient method I can think of would be:

1. Determine a max battery current based on the charger components. I'd conservatively set it at 20-25A for the ones in the schematic.
2. Set a max input current based on the power source.
3. Monitor both currents and ramp the power up until one or the other reaches its limit (rough sketch after the examples below).

From a 240V 30A source, the battery current would max out first at 20A or whatever, while from a 120V 10A outlet the source current would hit its limit first.
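Here's a minimal sketch of what step 3 could look like for a PWM-controlled charger. read_battery_amps(), read_source_amps(), and set_duty() are placeholders I made up; they're faked here so the loop can run on a PC:

Code:
#include <stdio.h>

/* Dual-limit bulk-charge loop: ramp the PWM duty up while both
   currents have headroom, back off when either limit is hit.
   The read_*/set_* functions stand in for the real hardware. */

#define BATT_AMPS_MAX 20.0   /* conservative limit from step 1 */
#define STEP          0.001  /* duty increment per control tick */

static double g_duty = 0.0;

/* Fake plant: pretend both currents scale with duty cycle. */
static double read_battery_amps(void) { return 60.0 * g_duty; }
static double read_source_amps(void)  { return 30.0 * g_duty; }
static void   set_duty(double d)      { g_duty = d; }

static void bulk_charge_tick(double source_amps_max)
{
    double d = g_duty;

    if (read_battery_amps() < BATT_AMPS_MAX &&
        read_source_amps()  < source_amps_max)
        d += STEP;   /* headroom on both limits: push harder */
    else
        d -= STEP;   /* a limit was reached: back off */

    if (d < 0.0) d = 0.0;
    if (d > 1.0) d = 1.0;
    set_duty(d);
}

int main(void)
{
    /* Simulate charging from a 120V outlet with an 8 A input limit. */
    for (int i = 0; i < 2000; i++)
        bulk_charge_tick(8.0);

    /* The source limit binds first here; battery stays under 20 A. */
    printf("settled: duty=%.3f batt=%.1f A src=%.1f A\n",
           g_duty, read_battery_amps(), read_source_amps());
    return 0;
}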

This would of course only apply during the bulk-charge phase, but it would allow for the quickest possible charge.

I guess it isn't strictly necessary. The user could just select the battery amps, knowing that the setting would have to be a little below the power source max. But I think it'd be easy to do, so why not?