Very good analysis JR. If we calculate a few points, we can draw a curve along those points to get a reasonably accurate graph.
The charging rates in your examples are overly optimistic for the Bolt (perhaps accurate for some Teslas?).
Here are some excerpts from what I've read concerning CCS DCFC for the Bolt:
Quote:
To get CLOSE to 90 miles in 30 minutes (based off the 238 mile EPA range), you will need to charge from a 125 amp DCFC.
The most I've ever seen is 22.27 kWh in a 30 minute session. As even DCFC'ing is not 100% efficient (probably a 5% loss or so), that 22.27 kWh from the station is probably more like 21.2 kWh into the actual battery.
21.2 kWh / 60 usable = 35.33% of the usable battery. 35.33% of 238 EPA miles = 84 EPA miles in 30 minutes.
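The quote's arithmetic checks out. Here it is as a quick Python sketch — the 60 kWh usable capacity and ~5% charging loss are the quote's assumptions, not measured values:

```python
# Values from the quoted post; the efficiency figure is an assumption (~5% loss).
kwh_from_station = 22.27
charging_efficiency = 0.95
usable_kwh = 60.0
epa_range_miles = 238.0

kwh_into_battery = kwh_from_station * charging_efficiency  # ~21.2 kWh into the pack
soc_gained = kwh_into_battery / usable_kwh                 # ~35.3% of usable capacity
miles_gained = soc_gained * epa_range_miles                # ~84 EPA miles
print(f"{miles_gained:.0f} EPA miles in 30 minutes")
```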

Quote:
SOC was 33% according to the station when I started charging. Immediately got a 38 kW/100 amp charging rate right after plugging in. It climbed up to around 45 kW/125A over the next 5–7 minutes. Saw a peak rate of 46 kW according to the car during the session.
When SOC hit 53–54%, the Bolt ramped down to 100 amps and a 37–38 kW charge rate, where it stayed till the end of the 30 minute session.
So 33–66% SOC in 30 minutes. 20 kWh in 30 min for a total of $3 (10 cents/min x 30 min), or 15 cents/kWh.
I plugged in again for the heck of it, and the charge rate went right to 38 kW. Stayed there till it hit 70% SOC, when it ramped down to 23–24 kW and 60–61 amps. Stayed there through 81% SOC, when I finally pulled the plug after 20 more minutes. Charged 9 more kWh.
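Piecing the two session reports together gives a rough picture of the Bolt's taper. Here's a crude piecewise model in Python — the breakpoints and rates are my approximations from the quotes, and the behavior above 81% is a pure guess:

```python
def bolt_dcfc_kw(soc):
    """Approximate DC fast-charge power (kW) at a given state of charge,
    pieced together from the quoted session reports above."""
    if soc < 0.53:
        return 45.0   # ~125 A region, reported up to ~53% SOC
    elif soc < 0.70:
        return 38.0   # ~100 A region, reported from ~54% to 70%
    elif soc < 0.81:
        return 24.0   # ~60 A region, reported from 70% to ~81%
    else:
        return 12.0   # assumed further taper above 81% (not in the quotes)

# Minute-by-minute estimate of a 10% -> 75% charge on an assumed 60 kWh pack.
usable_kwh, soc, minutes = 60.0, 0.10, 0.0
while soc < 0.75:
    soc += bolt_dcfc_kw(soc) / usable_kwh / 60.0  # SOC gained in one minute
    minutes += 1.0
print(f"~{minutes:.0f} minutes for 10% -> 75%")
```

Under these rough numbers a 10% to 75% charge takes close to an hour, which lines up with the estimate below.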

Ideally, we would pull off to charge exactly when the EV hits 0% remaining capacity. Getting as close to that as possible while leaving a buffer makes sense, so 10% (roughly a 24-mile reserve) is reasonable.
Since the best we can hope for in the Bolt is about 40 kW, an hour of charging adds roughly 40 kWh, or about 65% of the 60 kWh usable pack. Starting from the 10% buffer, that takes us to about a 75% charge, giving us 65% of usable range per one-hour charging session.
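As a sanity check on that estimate (assuming a flat 40 kW average rate and a 60 kWh usable pack — both round figures, not measurements):

```python
# Assumed round numbers: 40 kW average rate, 60 kWh usable, start at the 10% buffer.
avg_charge_kw = 40.0
usable_kwh = 60.0
start_soc = 0.10

soc_added = avg_charge_kw * 1.0 / usable_kwh  # ~67% of the pack per hour
end_soc = start_soc + soc_added               # ~77%; the post rounds to 65 points / 75%
print(f"{soc_added:.0%} added, ending near {end_soc:.0%}")
```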
That drastically changes things, increasing the time spent charging, increasing the number of charging sessions, and decreasing the range per charge.
Also, shouldn't the MPH vs Range curve be the inverse of what's shown?
Range should drop off at an increasing rate with speed, not a decreasing rate.
Assuming 40 kW charging and 65% of usable range restored per 1-hour charge (charging from 10% to 75%), I plugged the numbers into Excel.
At 50 MPH, travel time is 20hrs and charge time is 3hrs for a total trip time of 23hrs.
At 71 MPH, travel time is 14.08hrs and charge time is 6hrs, for a total trip time of 20.08hrs.
At 95 MPH, travel time is 10.53hrs and charge time is 11hrs for a total trip time of 21.53hrs.
Based on the best assumptions I have, it looks like the most time efficient speed for a 1000 mile trip is about 70 MPH.
Interestingly, when I change the trip length to 500 miles, or 3000 miles, 75 MPH is close to the most time efficient speed.
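A simple model lands in the same neighborhood. This uses my own hypothetical range-vs-speed curve (consumption growing with the square of speed, anchored at 238 miles of range at 65 MPH) rather than your Excel data, with each 1-hour charge restoring 65% of full range and the final charge allowed to be partial:

```python
def trip_time_hours(speed_mph, trip_miles=1000.0):
    # Hypothetical range model: consumption ~ speed^2, 238 mi of range at 65 MPH.
    range_full = 238.0 * (65.0 / speed_mph) ** 2
    drive_time = trip_miles / speed_mph
    first_leg = 0.90 * range_full                # leave home full, stop at 10%
    remaining = max(0.0, trip_miles - first_leg)
    charges = remaining / (0.65 * range_full)    # fractional: final charge can be partial
    return drive_time + charges * 1.0            # 1 hour per full charge

best = min(range(50, 96), key=trip_time_hours)
print(best)  # lands in the high 60s under these assumptions
```

One nice property: with a fractional final charge, the optimal speed in this model works out to be independent of trip length, which is consistent with the 500- vs 3000-mile observation above.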
Finally, one major problem with the formula is that it assumes you will spend an entire 1 hr to recharge even if you are just 1 mile shy of your destination when you reach 10% remaining battery capacity. In reality, if you hit 10% remaining and only have 15 miles to reach your destination, you'll likely keep going and skip that final hour of charging. Even if you can't quite make the final distance, you will only spend as much time charging as needed to get to the destination, assuming you can charge once the trip is completed.
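That final-charge adjustment is easy to approximate if we assume charge time scales roughly linearly with energy added within a session (reasonable at low SOC, before the taper kicks in), with 65% of 238 EPA miles restored per full 1-hour charge:

```python
def final_charge_hours(miles_short):
    """Charge just long enough to cover the distance still remaining.
    Assumes ~65% of 238 EPA miles (~155 mi) per 1-hour session, linear in time."""
    miles_per_charge_hour = 0.65 * 238.0  # ~154.7 mi per hour of charging
    return miles_short / miles_per_charge_hour

print(f"{final_charge_hours(15.0) * 60:.1f} min")  # ~5.8 min instead of a full hour
```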