Improving GPS Accuracy by POST PROCESSING
By James Harris  Revised 3/20/2010
Summary:  With data collection times from 1 minute to 16 minutes at a fixed position, a differential post-processed position accuracy of 1.3 meters down to less than 1/2 meter is achievable a measured 93.1% of the time.  Indeed, for 16 minutes of logged data, even the worst-case error was just under 0.5 meters.  The average and median position accuracy ranged from less than 0.6 meters for one minute of data to less than 0.25 meters for sixteen minutes of data.

CAUTION - USER BEWARE:  Obtaining pseudo range and carrier phase data involves using undocumented commands, with a risk of losing information stored in the GPS unit and even damaging its internal programming, which could make the GPS unit unusable.  Any use of undocumented commands with Garmin units, or of the data derived with such commands, is totally and absolutely at your own personal risk.

Background:  In December of 2001, I learned that by using undocumented commands, Pseudo Range and Carrier Phase data are available from the serial data port of many Garmin GPS units.  This opened the door to potentially obtaining much better accuracy than is available from basic GPS, DGPS, or WAAS GPS.  My initial information about the possibility of logging PR (Pseudo Range) and CP (Carrier Phase) data came from:  http://artico.lma.fi.upm.es/numerico/miembros/antonio/async/  This site provides a program, along with its source code, for logging data, and a second program to convert the logged binary files into the RINEX format.  With some GPS receivers, the version of this program I used couldn't transfer all the data in the time available when five or more satellites were visible, which produced errors in the logged data.  That problem did not exist with the GPS III.

Shortly after my earliest attempts using these programs, I discovered that the University of Nottingham (UK) also provided a data logging program called GRINGO and a post processing program called P4 that allowed the logging of PR and CP data from the Garmin G12, 12XL, II+, III+, ETREX, EMAP, GPSV, and G76 receivers.  The university's web site indicates that these programs are now unavailable, which is in my opinion a tragedy.  See: <http://www.nottingham.ac.uk/iessg/gringo/>  My work using GRINGO and P4 verified that the PR and CP data from Garmin GPS units could be differentially post processed with a huge improvement in the accuracy of a stationary GPS position.  (Differential post processing uses PR and CP data from a base station of precisely known position to greatly improve the GPS-determined position.  All of my references to PP (post processing) mean differential post processing.)

My results were chaotic for the first few months until I realized that I needed to evaluate the position errors I was seeing in a systematic way.  That meant using statistics rather than simply looking at individual results.  I settled upon calculating the WC (worst case), 95% (i.e., the limit below which 95% of errors fall), average, and median error values for each set of positions.  I eventually settled upon evaluating position accuracy for discrete logging and averaging times of 1 min, 2 min, 4 min, 8 min, 16 min, etc.  Table 1 and Graph 1 provide a comparison of the error for WAAS GPS, post processed PR, and post processed CP positions.  Graph 1 also vividly shows the promise and the horror of using a CP-determined position.  With my eyes locked tightly on that glorious potential accuracy of using CP data, I spent the next five or more years chasing that elusive pot of gold.  Finally, I was forced to give up the dream of using CP data and accept the lesser benefit in accuracy provided by using post processed PR data.  And as you can see from Graph 1, that is a worthwhile improvement in accuracy.  But the dream never really died, and I kept kicking the problem around until the obvious finally dawned on me.  PP (post processing) gives two positions!  The PR-determined position can be used as a decent estimate to throw out those wildly erroneous CP-determined positions.  This was the workable test that I had been seeking since 2001.
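To make the statistics concrete, here is a minimal sketch in Python (my choice of language; the function and variable names are illustrative and are not part of any program mentioned in this article) of how the WC, 95%, average, and median error values for a set of measured position errors could be computed.

    # Sketch only: compute the four error statistics discussed above
    # (worst case, 95% limit, average, and median) from a list of
    # horizontal position errors in meters.  Names are illustrative.
    import statistics

    def error_summary(errors_m):
        ranked = sorted(errors_m)
        n = len(ranked)
        worst_case = ranked[-1]
        # 95% limit: the error below which 95% of the measurements fall.
        limit_95 = ranked[max(0, int(round(0.95 * n)) - 1)]
        return {
            "worst_case": worst_case,
            "95_percent": limit_95,
            "average": statistics.mean(ranked),
            "median": statistics.median(ranked),
        }

    # Example: errors (meters) from repeated 1-minute logging sessions.
    print(error_summary([0.4, 0.6, 0.3, 1.1, 0.5, 0.7, 0.2, 0.9]))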

As a first estimate, the optimum test value was expected to be close to the sum of the PR95% and the CP95% error limits for the minutes of logged data, except that those wild CP errors corrupt the result.  It turns out that using a pre-screening test of 2 × PR95% as a test of acceptable CP values produces a CPtstd95% value that works well.  The value below is a pragmatic value that minimizes the 95% and the WC errors for the measurements I have made.

The test is simple.  If the absolute value of the difference between the PR- and CP-calculated positions is greater than 2.2 meters, discard the CP position and use the PR position instead.  It's that simple.  This test is harsh enough that some good positions are lost, but it avoids the really nasty errors.

In a few days I had another flash of insight.  1) The PR- and CP-determined positions are randomly distributed around the true position.  2) The largest errors in either solution are relatively rare and seldom occur in both positions in one measurement.  Thus, taking the average of the PR and CP acceptable positions will rather consistently reduce the error more often than it increases the error compared to using the acceptable CP value alone.  This is especially true for the larger errors in position.  It pains me that some of the most accurate CP positions are lost, but statistically, using the average of the PR and CP acceptable positions gives a practical improvement in the WC, 95%, and average accuracy for data logging times from 1 to 8 minutes.  And there just isn't any effective way to identify those wonderfully accurate CP positions that are lost by using the average of the PR and the CP position.  With 16 minutes of logged data, using the average of the PR and CP acceptable positions still provides a smaller worst-case error, but the cost is an increase in the 95% error limit.  With 32 minutes or more of logged data, just use the acceptable CP values.  (A sketch of this selection rule follows below.)

If avoiding or minimizing the number of errors larger than some limit MAX is of the highest importance to you, then perform a supplemental test to discard any positions for which the absolute value of (PR - CP) > Fraction × MAX.  Use a Fraction between 2/3 and 1 depending upon how important it is to avoid values greater than MAX.  This test is harsh in the extreme.  You will be discarding many positions whose error was acceptable - but there just isn't any way to reliably identify them.  Or, if you can afford the time, simply log data for 4 minutes or more.

I logged GPS data from a Garmin GPS V unit with its antenna centered above an NGS surveyed site marker.  The accurately known true coordinates of this site were obtained from an NGS data sheet obtained from the Internet.  Both the NGS surveyed mark and the reference base station position are given using the same NAD 83 datum.  The PR data and the CP data were then differentially post processed with the program DP4 that came with GRINGO.  (NOTE:  the site position had to be corrected to the same date as the position information for the GPS base station.  A program to do this is available from the Internet.)  Attached is an Excel spreadsheet with a table and two graphs of the results of my work.  Final data from three accurately known positions is presented.  Six or more satellites must be available to achieve the identified accuracy.  A mask angle of 15 degrees, a S/N ratio of 6, and PR smoothing were used.  The base station data was interpolated to give 1-second increments.  I deliberately chose a reference station that was farther away (approximately 53 kilometers distant) than the best available station, and one that recorded data only at 15-second intervals rather than the 5-second intervals available from the best-choice reference station.
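As a concrete illustration of the selection rule described above, here is a minimal Python sketch.  The function name, the east/north coordinate representation, and the optional MAX screening parameters are my own illustrative assumptions; this is not code from GRINGO or DP4.

    # Sketch of the PR/CP selection rule described above.  Positions are
    # assumed to be local east/north coordinates in meters relative to a
    # convenient origin; names are illustrative, not from GRINGO or DP4.
    import math

    def combine_pr_cp(pr, cp, threshold_m=2.2, max_error_m=None, fraction=1.0):
        """Return the position to report, or None to discard the measurement.

        pr, cp       -- (east, north) post processed positions in meters.
        threshold_m  -- if |PR - CP| exceeds this, the CP fix is rejected
                        and the PR fix is used alone (2.2 m is the pragmatic
                        value quoted above).
        max_error_m  -- optional supplemental test: discard the measurement
                        entirely when |PR - CP| > fraction * max_error_m.
        """
        sep = math.hypot(pr[0] - cp[0], pr[1] - cp[1])
        if max_error_m is not None and sep > fraction * max_error_m:
            return None                     # harsh supplemental screening
        if sep > threshold_m:
            return pr                       # CP fix looks wild; keep PR only
        # CP fix passed the screen: average PR and CP to trim the larger errors.
        return ((pr[0] + cp[0]) / 2.0, (pr[1] + cp[1]) / 2.0)

    # Example: the CP fix agrees with the PR fix to within 2.2 m,
    # so the average of the two positions is reported.
    print(combine_pr_cp((1.2, -0.8), (0.6, -0.4)))

The averaging step reflects the insight that the larger PR and CP errors rarely occur together in the same measurement; for 32 minutes or more of logged data the advice above amounts to skipping the averaging and keeping the accepted CP fix outright.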

Using 1-second data is critically important with only 1 minute of logged data.  A Microsoft Excel file with a table of the measured errors and two graphs from that data, plotting positional error vs. minutes of logged data, is attached.  Graph 2 shows the accuracy obtainable by using either the average of the PR and CP positions, or the PR position alone if the difference between the PR and CP positions is greater than 2.2 meters.

The information presented meets my needs for accuracy.  I share the tables and graphs in the belief that they are a significant improvement over nothing at all.  Please, use them cautiously - if at all.

I think, from my experiences with the data, that it may be possible to identify from the residuals and other tools included in DP4 when some of the largest errors exist.  Step changes, ramping changes, or spikes were present in the individual satellite residuals for the measurements with the largest errors, though not in all of the larger errors.  My personal experience is that with nine or ten satellites in the logged data, removing a satellite that exhibits step changes, ramp changes, or spikes in its residual from the CP solution improves the accuracy of the result.  (This has not been done in the data I present.)  With fewer satellites, removing a problem satellite can actually increase the error.  (In my defense, that's why I spent 5 years checking things like that.)

Another way to improve confidence in the position accuracy is to take the average position from post processing your logged position data using three to five base stations located in a circle about your position, as in the sketch below.  (The material I've supplied is from using only one base station to determine the position.)  BTW:  not using smoothing gives a second PR position.
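A minimal sketch of the multiple-base-station idea, again in Python with illustrative names (the results presented in this article use a single base station): the combination itself is just a coordinate-wise mean of the separately post processed solutions.

    # Sketch: combine the solutions obtained by post processing the same
    # logged data against several surrounding base stations.  Positions
    # are local east/north offsets in meters; names are illustrative.
    def average_position(positions):
        """positions -- list of (east, north) solutions, one per base station."""
        n = len(positions)
        east = sum(p[0] for p in positions) / n
        north = sum(p[1] for p in positions) / n
        return (east, north)

    # Example: solutions from three base stations arranged around the site.
    print(average_position([(0.4, -0.2), (0.1, 0.3), (-0.2, 0.1)]))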

James Harris

P.S.  I have not personally had any bad experience with the publicly released versions of the programs that I have used.  I did have one lock-up problem with a Garmin eMap GPS while using an "experimental" program.  Powering the GPS off, resetting the Garmin interface to its default, and possibly going through a second power-off and power-on cycle cleared things up.

If anyone is interested in buying the full-capability program, Chris Hill can be reached through the following link.

http://www.bigf.ac.uk/gringo/

The following graph requires an XML reader.  The Microsoft Reader (free) can be downloaded HERE.

Spreadsheet Table