Given the data variation that was apparent from the hand-held GPS, a test was devised...
Set the GPS to record one sample every:
1 second
2 seconds
4 seconds
10 seconds
The object of this test was to determine whether the arbitrary setting of one sample every two seconds could be improved on.
The maximum sample rate of the GPS is 1 sample per second.
We were on a road trip on the Interstate, traveling at roughly the same speed throughout, so the GPS was set to record distance traveled at each of the four settings listed above.
The data was downloaded and analyzed in a spreadsheet. The goal was to find a setting or settings that minimized the sample-to-sample variation in distance traveled. This is key, since the GPS calculates the apparent speed from distance and time, so if the recorded distance varies, the speed calculation will as well.
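As a rough illustration of that point (not necessarily how the GPS firmware actually does it), the apparent speed is just the change in recorded distance divided by the sample interval, so any wobble in distance shows up directly in the speed estimate. The distances below are made up for the example:

```python
# Illustration only: assumes speed is derived as delta-distance / delta-time.
interval_s = 2.0                         # one sample every 2 seconds
dist_m = [0.0, 58.0, 121.0, 176.0]       # hypothetical cumulative distances (meters)

for d0, d1 in zip(dist_m, dist_m[1:]):
    speed_ms = (d1 - d0) / interval_s    # apparent speed over this interval
    print(f"{speed_ms:.1f} m/s ({speed_ms * 2.237:.1f} mph)")

# A 3 m error in a single distance sample shifts that interval's speed by
# 3 m / 2 s = 1.5 m/s (about 3.4 mph), so distance variation maps straight
# into speed variation.
```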
The first graph shows the change in distance along the Y (north) axis, since on this portion of the highway we were moving primarily north. The blue line is the raw data, and the pink line is the data smoothed by three passes of a 5-sample average. Note: a high-order polynomial was not used for this smoothing, since it has a tendency to oscillate and not follow the data accurately, especially near the points around 58380 or so.
Averaging the raw data allows it to be "normalized," as shown in the second graph. Now one can see the true variation of the raw data. Three standard deviations of this variation were then calculated in the spreadsheet (roughly a 99% confidence level).
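For anyone who wants to try the same reduction outside a spreadsheet, here is a minimal Python sketch of the steps described above: three passes of a 5-sample average, then the raw-minus-smoothed residual and its three-standard-deviation spread. The track data here is simulated, and the noise level is an assumption, not the actual logged values.

```python
import random
import statistics

def smooth_pass(values, window=5):
    """One pass of a centered 5-sample average (shorter window at the ends)."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        chunk = values[max(0, i - half):i + half + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

# Simulated stand-in for the downloaded distance column:
# steady travel (~58 m every 2 s, about 65 mph) plus a few meters of GPS noise.
random.seed(1)
raw = [i * 58.0 + random.gauss(0, 3.0) for i in range(200)]

# Smooth: three passes of the 5-sample average, as in the spreadsheet.
smoothed = raw
for _ in range(3):
    smoothed = smooth_pass(smoothed)

# "Normalize": subtract the smoothed curve so only the sample-to-sample wobble remains.
residual = [r - s for r, s in zip(raw, smoothed)]

# Trim a few samples at each end, where the short averaging window distorts the fit,
# then report three standard deviations (~99% of samples fall inside this band).
core = residual[10:-10]
print(f"3-sigma spread: +/-{3 * statistics.stdev(core):.1f} m")
```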
After performing these operations for all four sample rates, here is what the data spread for each setting looks like...
Notice that the 4-second setting did not work as well as one would expect. It may be that several semi trailers driving by partially blocked the satellite signals enough to cause the higher variation.
The fastest rate on the GPS seems to tax the unit with the high overhead of reading satellite data, calculating location, and saving this data to the internal memory.
Setting the unit to sample every 10 seconds would reduce the variation, but it would also severely limit the sheer number of samples collected during a coast-down test. And the data variation is not much better than with the 2-second setting.
It does appear that the setting of 2 seconds for each sample is a good compromise between rate and data variability.
Jim.