On Oct 30, 10:08 pm, HIPAR wrote:
> On Oct 30, 4:44 pm, macpacheco wrote:
> > NASA JPL operates a Global Differential GPS system with worldwide
> > coverage. They claim 10 cm performance.
> > 'The NASA Global Differential GPS (GDGPS) System is a complete, highly
> > accurate, and extremely robust real-time GPS monitoring and
> I believe the John Deere Starfire commercial service is based upon the
> Why can't airplanes use it ?
- 10 cm performance in real-time kinematic mode or in post-processing?
- 10 cm performance at what confidence level? SBAS confidence levels
are 99.99%-99.99999%, instead of the usual 50-95%.
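To see why the confidence level matters so much, here's a rough Python sketch of how a stated confidence level maps to a sigma multiplier (the K-factor applied to the error standard deviation), assuming Gaussian errors; the function name is mine, and real integrity analyses use overbounding distributions rather than a plain Gaussian:

```python
from statistics import NormalDist

def k_factor(confidence: float) -> float:
    """Two-sided Gaussian quantile: the K such that
    P(|error| < K * sigma) equals the stated confidence level."""
    return NormalDist().inv_cdf((1 + confidence) / 2)

# A "95%" accuracy figure uses roughly 2-sigma; an SBAS-style
# 99.99999% integrity bound needs well over 5-sigma.
for conf in (0.50, 0.95, 0.9999, 0.9999999):
    print(f"{conf:.7%} -> K = {k_factor(conf):.2f}")
```

The same underlying sigma quoted at 50% versus 99.99999% differs by nearly an order of magnitude in the resulting bound, which is why a "10 cm" marketing number and an SBAS protection level aren't comparable.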
- Any system with that kind of performance today HAS to use semi-codeless
tracking at the end-user level (L2 band). The FAA/ICAO/EASA have an
ARNS paranoia mentality that considers any use of signals outside
ARNS-protected bands a BIG no-no for end-user equipment. While I don't
agree, I understand that decision within their paranoia mentality.
That's because the biggest bottleneck in SBAS performance today is
IONO corrections, which must be applied on a grid basis, since SoL
(Safety of Life) end users can't use semi-codeless tracking for
autonomous IONO corrections. SBAS iono grids are broadcast on a
5x5-degree spacing, and their calculation is extremely dependent on
station density.
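For anyone who hasn't looked at how the grid gets used: the receiver bilinearly interpolates the vertical delay from the four grid points surrounding its pierce point. A simplified Python sketch (the function name and the flat lat/lon cell model are mine; the real procedure works at the ionospheric pierce point and applies an obliquity factor):

```python
def interpolate_iono_delay(lat, lon, grid, spacing=5.0):
    """Bilinearly interpolate vertical ionospheric delay (meters)
    from the four grid points surrounding (lat, lon).
    `grid` maps (lat, lon) of a grid point to its broadcast delay."""
    lat0 = spacing * (lat // spacing)   # SW corner of the 5x5 cell
    lon0 = spacing * (lon // spacing)
    x = (lon - lon0) / spacing          # fractional position in cell
    y = (lat - lat0) / spacing
    d00 = grid[(lat0, lon0)]
    d10 = grid[(lat0, lon0 + spacing)]
    d01 = grid[(lat0 + spacing, lon0)]
    d11 = grid[(lat0 + spacing, lon0 + spacing)]
    return (d00 * (1 - x) * (1 - y) + d10 * x * (1 - y)
            + d01 * (1 - x) * y + d11 * x * y)

# Illustrative grid cell over the southern US (delays in meters):
grid = {(30.0, -90.0): 2.0, (30.0, -85.0): 4.0,
        (35.0, -90.0): 6.0, (35.0, -85.0): 8.0}
print(interpolate_iono_delay(32.5, -87.5, grid))  # cell center -> 5.0
```

The point about station density follows directly: each broadcast grid value is itself estimated from nearby reference-station measurements, so a sparse network means a coarse, conservative grid.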
- Finally, end-user receivers are not required to have extensive
multipath rejection; that's the second-biggest error factor in SBAS.
- If SBAS end users were allowed to use semi-codeless tracking today,
and L2C/L5 once they become healthy, for autonomous IONO corrections,
the current VPL/HPL (vertical/horizontal protection levels), which
range from 15-50 meters today, would easily come down to 5-20 meters.
But that's 5 meters at the 99.999% level (or better), which means
measured performance would easily be sub-meter. Add an extensive
multipath-rejection requirement and UDRE would easily come down from
3 meters to 1.5 meters, with protection levels in the 2-10 meter
range. The UDRE level includes a multipath error budget for end-user
equipment. A smarter approach would have been to exclude receiver
errors from the UDRE levels, determine receiver errors (including
multipath) at equipment certification time, and have the end-user
equipment add its own error to the broadcast UDRE levels. That would
provide a more competitive landscape: equipment with better multipath
rejection would be able to fly LPV200 approaches under scenarios that
are forbidden today, and could even provide a basis for CAT II
approaches today.
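The proposal above amounts to moving one term out of the broadcast sigma and into a per-receiver certified budget, root-sum-squared at the user end. A back-of-envelope Python sketch with illustrative numbers (the K-factor, function name, and single-satellite simplification are mine; real SBAS projects per-satellite variances through the position-solution geometry):

```python
from math import sqrt

K_V = 5.33  # illustrative vertical K-factor for ~1e-7 integrity risk

def vpl(sigma_udre, sigma_receiver, sigma_iono=0.0, sigma_tropo=0.0):
    """Vertical protection level from root-sum-squared error
    components (single-satellite simplification)."""
    sigma = sqrt(sigma_udre**2 + sigma_receiver**2
                 + sigma_iono**2 + sigma_tropo**2)
    return K_V * sigma

# Today: worst-case multipath folded into a 3 m UDRE for everyone.
today = vpl(sigma_udre=3.0, sigma_receiver=0.0)
# Proposed: 1.5 m receiver-free broadcast sigma, plus each receiver's
# own certified budget (0.5 m here for a good multipath-rejecting set).
proposed = vpl(sigma_udre=1.5, sigma_receiver=0.5)
print(today, proposed)  # the better receiver gets a much tighter VPL
```

The competitive angle is that `sigma_receiver` becomes a spec each manufacturer can fight over, instead of everyone inheriting the same worst-case allowance.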
Theoretically, if you plugged SBAS corrections into an end-user
receiver that uses the Starfire logic, you should get half-meter
performance or better. The remaining difference is due to the absence
of carrier-phase information in the SBAS data stream.
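For readers unfamiliar with why carrier phase buys so much: even without the corrections stream carrying phase data, receivers already exploit the low-noise carrier locally via carrier smoothing. A minimal Hatch-filter sketch in Python (function name and the simulated numbers are mine):

```python
def hatch_filter(code, carrier, window=100):
    """Carrier-smooth a sequence of code pseudoranges (meters) using
    carrier-phase ranges (meters). Classic Hatch filter: the noisy
    code anchors the absolute range, while the low-noise carrier
    tracks its change from epoch to epoch."""
    smoothed = [code[0]]
    for k in range(1, len(code)):
        n = min(k + 1, window)
        # propagate last estimate forward with the carrier delta,
        # then blend in a small fraction of the new code measurement
        predicted = smoothed[-1] + (carrier[k] - carrier[k - 1])
        smoothed.append(code[k] / n + predicted * (n - 1) / n)
    return smoothed

# Simulated static satellite: true range 20,000 km, code noise +/-2 m,
# carrier noise negligible.
true_range = 20_000_000.0
code = [true_range + (2.0 if i % 2 == 0 else -2.0) for i in range(50)]
carrier = [true_range] * 50
print(hatch_filter(code, carrier)[-1])  # converges near 20,000,000 m
```

The SBAS-vs-Starfire gap the paragraph describes is the next step beyond this: carrying precise phase-consistent corrections in the data stream itself, which the 250 bps SBAS format doesn't do.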
SBAS uses a 250 bps data stream. If you removed the IONO grid from
that stream, you would take away 80% of the data, which would allow a
fivefold increase in clock/ephemeris update rates. Or you could easily
add carrier-phase information and still increase the update rate. LAAS
includes carrier-phase information, for instance.
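The bandwidth arithmetic above, written out (the 80% iono share is the figure from this post, not an official budget):

```python
SBAS_DATA_RATE_BPS = 250   # one 250-bit SBAS message per second
iono_share = 0.80          # fraction of the stream the iono grid eats
                           # (figure claimed above, illustrative)

remaining = SBAS_DATA_RATE_BPS * (1 - iono_share)  # 50 bps left today
freed = SBAS_DATA_RATE_BPS * iono_share            # 200 bps reclaimed

# Dropping the grid lets clock/ephemeris use the whole stream:
update_rate_gain = SBAS_DATA_RATE_BPS / remaining
print(update_rate_gain)  # 5.0 -> the fivefold figure
```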
Finally, SBAS ephemeris/clock accuracy would improve significantly if
all SBAS systems used each other's pseudoranges. In WAAS's case that
would mean using at least three strategically selected MSAS stations
and four strategically selected EGNOS stations, allowing for almost
worldwide ephemeris/clock coverage. Today, satellites flying over the
Indian Ocean get Not Monitored flags in WAAS, while satellites over
the South Pacific get NM flags in EGNOS. EGNOS fares a little better
thanks to having one station in South Africa and one in French Guiana
(in South America, very close to the equator).
If all SBAS systems used 100% of each other's reference stations, they
would be able to provide worldwide clock/ephemeris updates for all
satellites, at UDRE 3 for all satellites north of 30S latitude and
mostly UDRE 3 for satellites south of that line.
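The Not Monitored behavior comes down to geometry: a satellite can only be monitored if enough reference stations see it above the elevation mask. A crude Python sketch with a spherical earth (function names, the two-station threshold, and the 5-degree mask are my simplifications, not any system's actual monitoring criterion):

```python
from math import radians, degrees, sin, cos, acos, atan2

R_EARTH = 6371.0    # km, spherical-earth approximation
GPS_ALT = 20200.0   # km, nominal GPS orbit altitude

def elevation_deg(sta_lat, sta_lon, sub_lat, sub_lon, alt=GPS_ALT):
    """Elevation of a satellite (given its sub-satellite point and
    altitude) as seen from a ground station, spherical earth."""
    # central angle between station and sub-satellite point
    psi = acos(sin(radians(sta_lat)) * sin(radians(sub_lat)) +
               cos(radians(sta_lat)) * cos(radians(sub_lat)) *
               cos(radians(sta_lon - sub_lon)))
    r = R_EARTH + alt
    # elevation from the station-satellite slant triangle
    return degrees(atan2(r * cos(psi) - R_EARTH, r * sin(psi)))

def monitored(stations, sub_lat, sub_lon, mask=5.0, min_stations=2):
    """Satellite counts as monitored if enough stations see it above
    the elevation mask (toy criterion)."""
    return sum(elevation_deg(la, lo, sub_lat, sub_lon) >= mask
               for la, lo in stations) >= min_stations

# A CONUS-only network can't see a satellite over the Indian Ocean:
conus = [(40.0, -100.0), (35.0, -90.0)]
print(monitored(conus, 0.0, 75.0))  # False
```

Pooling the WAAS/EGNOS/MSAS networks widens the set of station-satellite lines of sight, which is exactly why the combined system could monitor satellites over the Indian Ocean and South Pacific that each network misses alone.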