I do and don't agree. Let me elaborate.
I definitely think bottom balancing is the way to go. However, I do not believe that it allows all the cells to reach the knee at the same point. My experience has shown otherwise, and I had a hole in the back of my Prius' windshield due to overcharging cells, so it definitely IS a problem.
Nearly all the problems with lithium cells have to do with heat build-up, and it happens at both ends of the charge. You can drain a 100% full battery at a high C rate, and it won't REALLY start warming up until the last bit. Same with charging. Jack has proven this in his testing. You can charge a cell to 95% capacity, and it will barely get warm. But pack that last 5% in, and that is when it really starts warming up.
My solution is to completely avoid the two ends of the spectrum. To do this you really want some sort of BMS. You can do it manually, but it is a pain. My version of a BMS is not really a Battery Management System as much as a Battery Monitoring System though.
Let me start by explaining the charging side. Normal lithium cells go through the same two-stage charging that a lead-acid battery does. The first stage is constant current: you charge your cells at, say, 10A until they hit, say, 3.6V. After that, it switches to constant voltage, and the current drops to maintain that 3.6V. This second stage is where you get that last 5%, and it's where your heat comes from. I would suggest giving up that 5% and just eliminating the constant voltage stage. You lose 5% of battery capacity, but you gain cell life. Now, to use this on a full pack of batteries, you need to monitor every cell's voltage. With a bottom balanced pack, your weakest cell is going to hit 3.6V first. Once that cell hits 3.6V, you are done. You are always limited by your weakest cell. This is how it should be, so you don't beat that cell to death. That is pretty much it for charging.
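As a rough sketch of that per-cell cutoff logic (in Python; `read_cells` and `charger_off` are hypothetical hooks into your monitor boards and charger relay, not anything from my actual build):

```python
# Sketch of CC-only charge termination: charge at constant current and
# stop the moment ANY cell reaches the per-cell limit, skipping the
# constant-voltage stage entirely.
import time

CELL_LIMIT_V = 3.6  # per-cell cutoff

def first_full_cell(cell_voltages, limit=CELL_LIMIT_V):
    """Return the index of the first cell at/over the limit, else None."""
    for i, v in enumerate(cell_voltages):
        if v >= limit:
            return i
    return None

def charge_loop(read_cells, charger_off, poll_s=1.0):
    """Poll all cell voltages; kill the charger as soon as the weakest
    (i.e. first-to-fill) cell reaches the limit."""
    while True:
        hit = first_full_cell(read_cells())
        if hit is not None:
            charger_off()
            return hit  # index of the cell that ended the charge
        time.sleep(poll_s)
```

The point is that the loop never tries to hold 3.6V; the charge simply ends the first time any cell touches the limit.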
Some guys do this all manually. They identify their weakest cell, and they watch the pack voltage at the point where that lowest cell hits 3.6V. They then program their charger to shut off at whatever pack voltage that was. This works, but it doesn't account for any cell drift or degradation, which is why I prefer monitoring each cell. Still, it does work.
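The fixed pack-voltage version can be sketched the same way (again Python, with illustrative names of my own). You record the pack voltage once, at the moment the known weak cell first hits 3.6V, and program that number into the charger; the comment notes why drift breaks it:

```python
# Sketch of the "manual" method: note the pack voltage at the instant the
# known-weakest cell first reaches 3.6 V, and use that as a fixed charger
# cutoff. Caveat: if any cell drifts or degrades later, the weak cell will
# hit 3.6 V at a DIFFERENT pack voltage, and this fixed cutoff no longer
# protects it.

def pack_cutoff_voltage(cell_voltages, weak_index, limit=3.6):
    """If the weak cell is at/over the limit, return the total pack
    voltage to program into the charger; otherwise None (keep charging)."""
    if cell_voltages[weak_index] >= limit:
        return round(sum(cell_voltages), 2)
    return None
```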
The discharge side IMO has to be handled with an amp-hour meter/counter. It's best to capacity test all of the cells in your pack so you know which is the weakest and you know you have 100Ah (or whatever) to use; then you can say that once you've used 70Ah, you're done. This also gives you a great 'fuel gauge' that is dead on. There is no guessing like with a voltmeter.
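A minimal amp-hour counter sketch (Python; the 70Ah budget on a 100Ah pack matches the numbers above, but the class and sensor interface are my own assumptions, not my actual design):

```python
# Sketch of an amp-hour counter used as a "fuel gauge": integrate current
# samples over time and stop discharging once the usable budget is spent.
# In a real build the samples would come from a shunt or Hall-effect
# current sensor; here they are simply passed in.

class AhCounter:
    def __init__(self, usable_ah=70.0):
        self.usable_ah = usable_ah  # e.g. 70 Ah budget of a 100 Ah pack
        self.used_ah = 0.0

    def sample(self, amps, dt_s):
        """Accumulate one current sample (discharge positive, dt in seconds)."""
        self.used_ah += amps * dt_s / 3600.0

    def remaining_ah(self):
        return self.usable_ah - self.used_ah

    def empty(self):
        """True once the pack has given up its usable capacity."""
        return self.used_ah >= self.usable_ah
```

In practice the accuracy of this gauge lives or dies on the current sensor and the sampling rate, which is exactly where my own counter fell short.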
If you check out my bms thread, I have a system pretty much designed. I never really got my Ah counter working with enough accuracy that I was comfortable using it. However, the charging side of it worked great, and it was simple and cheap to build.
http://ecomodder.com/forum/showthrea...tem-20445.html