Rounding error needs to be considered for an odometer or the Ultra-Gauge (not sure about Scan-Gauge; haven't used one), because a digital readout has a precision of only 1/10th of a mile (i.e. one digit). For calibrating distance, U-G recommends using an accurately measured mile. However, a reading of 1.0 miles on the U-G readout could be anywhere between 0.95 and 1.05 miles, assuming the U-G uses standard rounding. This means that when using this method (1-mile distance) the precision has a broad range of roughly -5% to +5%, even if the test distance is precisely measured and marked (Google Earth or GPS). Calibrated this way, for an indicated 40 MPG the true MPG could be anywhere between 38 and 42 MPG. It may be possible to reduce the uncertainty by carefully watching for the OD to just click over to a new tenth, starting the calibration drive at that instant, and then carefully watching for the next mile to just click over. However, this means that the starting and ending points are at arbitrary (not predetermined) locations, so using a GPS and traveling slowly would be necessary.
While a longer test distance would seem to give a more accurate calibration because the half-digit rounding error becomes a smaller fraction of the total, the question then becomes: how precisely can a longer, actually traveled test distance (say 10 or 100 miles) be measured? That distance would be harder to verify than a one-mile course.
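To make the trade-off concrete, here is a small sketch of the half-digit rounding math. It assumes the gauge rounds to the nearest 0.1 mile (if it truncates instead, the interval shifts to [d, d + 0.1) but has the same width); the function name and structure are just for illustration, not anything from the U-G documentation.

```python
def rounding_uncertainty(test_miles, resolution=0.1):
    """Relative error bounds when a display with the given resolution
    shows exactly test_miles, assuming round-to-nearest behavior."""
    half = resolution / 2  # true distance lies within +/- half a digit
    low = (test_miles - half) / test_miles - 1
    high = (test_miles + half) / test_miles - 1
    return low, high

# Longer test distances shrink the relative rounding error proportionally.
for d in (1, 10, 100):
    lo, hi = rounding_uncertainty(d)
    print(f"{d:>4} mi test: {lo:+.2%} to {hi:+.2%}")
```

For a 1-mile course this gives the +/-5% band discussed above (hence 38 to 42 true MPG for an indicated 40); a 10-mile course cuts it to +/-0.5%, which is why the measurement accuracy of the longer course, not the display resolution, becomes the limiting factor.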