#1
According to an article in the November 2004 edition of Scientific American, the GPS satellites' clocks have a 7 microsec per day effect due to relativistic motion and a 45 microsec per day effect due to the lower gravity in orbit. The two effects partially cancel, so the net adjustment is 38 microseconds per day. It is left as an exercise for the student to determine the effect of an uncompensated time value on position values.
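A quick sanity check on those figures (a minimal Python sketch, not from the article; it just multiplies the quoted numbers out):

```python
# Net daily clock offset quoted from the article: the clocks run ~45 us/day fast
# from the weaker gravity in orbit and ~7 us/day slow from their orbital speed.
C = 299_792_458.0                      # speed of light, m/s

net_offset_s = (45.0 - 7.0) * 1e-6     # ~38 microseconds per day
range_equivalent_m = C * net_offset_s  # distance light covers in that time

print(f"net offset: {net_offset_s * 1e6:.0f} us/day")
print(f"light-travel equivalent: {range_equivalent_m / 1000:.1f} km per day")
```

The naive conversion comes out to roughly 11 km per day, which is the "~10 kilometers" extrapolation debated in the replies below.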
#2
Well, I'm doing really well if I can keep the ILS needle off the end of the scale...
#4
Stan Gosnell wrote:
> It's many miles. Nanosecond accuracy is required for the current ~10
> meter accuracy. Linear extrapolation would lead to ~10 kilometers.

The speed of light is about 30 cm per nanosecond (a foot, give or take a bit), so 10 meters calls for a resolution of about 30 nsec. Not really a big deal these days (I design systems that must resolve to better than 0.5 nsec). But the extrapolation to 10 km (which is indeed how far light travels in 38 microseconds) doesn't work.

GPS receivers don't rely on an internal, independent clock. They synchronize to the satellites - which is a sloppy way of putting it anyway. The real issue is the difference between travel times of signals from different satellites, not the absolute travel time. Thus what matters here (to a first approximation, anyway) is that the satellites are synchronized to each other, not to any earthbound clock.

To a second approximation, it is important that the almanac be right. In other words, the satellite needs to be where it is expected to be at the time it transmits. However, now the errors measured in microseconds are much smaller - the key parameter is not how far light travels in those microseconds (about 10 km, as you noted), but how far the satellite travels in those microseconds (more properly measured in centimeters rather than kilometers). Of course the error, if not corrected, is cumulative. After a few weeks it would be quite significant.

Michael
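To put a rough number on the "centimeters rather than kilometers" point, here is a small sketch; the orbital radius is an assumed round figure for a GPS orbit, not something given in the post:

```python
# How far the satellite itself moves during a 38 us timing offset.
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
ORBIT_RADIUS = 26_560e3     # assumed GPS orbital radius, m (~20,200 km altitude)

orbital_speed = math.sqrt(MU_EARTH / ORBIT_RADIUS)  # circular-orbit speed, ~3.9 km/s
offset_s = 38e-6

print(f"orbital speed: {orbital_speed / 1000:.2f} km/s")
print(f"satellite travel in 38 us: {orbital_speed * offset_s * 100:.0f} cm")
```

That works out to around 15 cm of along-track position error per day of uncorrected drift, consistent with the centimeters figure above.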
#5
Stan Gosnell wrote:
> (Everett M. Greene) wrote:
>> According to an article in the November 2004 edition of Scientific
>> American, the GPS satellites' clocks have a 7 microsec per day effect
>> due to relativistic motion and a 45 microsec per day effect due to the
>> lower gravity in orbit. The two effects partially cancel, so the net
>> adjustment is 38 microseconds. It is left as an exercise for the student
>> to determine the effect of an uncompensated time value on position
>> values.
>
> It's many miles. Nanosecond accuracy is required for the current ~10
> meter accuracy. Linear extrapolation would lead to ~10 kilometers.

Is it as great as that? The relativistic effects would be the same for all satellites, so, while a clock on Earth may disagree with a clock on the satellite, all satellites would disagree by the same amount. Therefore, while the uncompensated effect may well be several kilometres, wouldn't it always be the _same_ kilometres?

--
Nick
#6
Nick wrote:
> Is it as great as that? The relativistic effects would be the same for
> all satellites, so, while a clock on Earth may disagree with a clock on
> the satellite, all satellites would disagree by the same amount.
> Therefore, while the uncompensated effect may well be several kilometres,
> wouldn't it always be the _same_ kilometres?

No. They originally tried it without any corrections, because many of the design engineers didn't think it would matter. Turned out it did matter. The theoretical error is large, although I can't remember the exact numbers. BTW, it was September 2004, not November.

The GPS receiver doesn't really know the time; it just synchronizes with the time reported by the satellites. If it thinks the time is different from what it really is, then it thinks it's in the wrong position, because it calculates position based on the difference in time it takes the signals to travel from different satellites. The SciAm article has a fuller explanation, and you can also find several explanations on the net. You can start at http://www.gpsinformation.net.

--
Regards,
Stan

"They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety." B. Franklin
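A minimal numeric sketch of the mechanism described above (the satellite positions and error values are made up for illustration, not real ephemeris data): the receiver solves for its own clock bias along with its position, so an offset shared by every satellite is largely absorbed into that bias term, while offsets that differ from satellite to satellite turn directly into position error.

```python
# Toy pseudorange fix: solve for receiver position and clock bias by Gauss-Newton.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Made-up satellite positions (m), roughly at GPS orbital radius.
sats = np.array([
    [26_560e3,        0.0,        0.0],
    [20_000e3,  17_000e3,   5_000e3],
    [20_000e3, -10_000e3,  15_000e3],
    [22_000e3,   3_000e3, -14_000e3],
])
receiver_true = np.array([6_378e3, 0.0, 0.0])   # a point on the Earth's surface

def solve(pseudoranges, n_iter=10):
    """Least-squares fix for position (m) and receiver clock bias (expressed in m)."""
    x, b = np.zeros(3), 0.0
    for _ in range(n_iter):
        d = np.linalg.norm(sats - x, axis=1)
        residual = pseudoranges - (d + b)
        # Jacobian: d(rho)/dx = -(sat - x)/d, d(rho)/db = 1
        J = np.hstack([-(sats - x) / d[:, None], np.ones((len(sats), 1))])
        step, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x, b = x + step[:3], b + step[3]
    return x, b

true_ranges = np.linalg.norm(sats - receiver_true, axis=1)

# Case 1: the SAME 38 us error on every satellite clock.
pos, bias = solve(true_ranges + C * 38e-6)
print(f"common 38 us error  -> position error {np.linalg.norm(pos - receiver_true):.2e} m")

# Case 2: errors that DIFFER between satellites (a few tens of ns each).
per_sat_s = np.array([10.0, 40.0, 25.0, 60.0]) * 1e-9
pos, _ = solve(true_ranges + C * per_sat_s)
print(f"differing ns errors -> position error {np.linalg.norm(pos - receiver_true):.1f} m")
```

In a real receiver the common 38 us/day would still matter indirectly, as Michael noted above, because the satellites' broadcast positions would drift out of step with where they actually are as the error accumulates.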
#7
Stan Gosnell wrote:
> (Everett M. Greene) wrote:
>> According to an article in the November 2004 edition of Scientific
>> American, [...] the net adjustment is 38 microseconds. It is left as an
>> exercise for the student to determine the effect of an uncompensated
>> time value on position values.
>
> It's many miles. Nanosecond accuracy is required for the current ~10
> meter accuracy. Linear extrapolation would lead to ~10 kilometers.

An easy back-of-the-envelope way to look at this is that light travels roughly a foot per nanosecond. 38 microseconds = 38000 nanoseconds. Thus, in 38 microseconds, light will travel about 38000 feet, or about 7 miles.

--
Dane
#8
Others have covered this amply already; it's really not about how far light goes in 38us. However, I do want to plug one of my favorite internet tools: the Google Calculator. If you google on "38 microseconds * c in miles" it'll print out 7.079 for you. It does dimensional analysis and has a half decent library of constants, too. By using 'in' at the end of the calculation you get to specify what form you'd like to see your result.

--
dave j
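The same check works offline too; a tiny Python sketch that mirrors the Google Calculator query:

```python
# "38 microseconds * c in miles" done by hand.
C = 299_792_458.0            # speed of light, m/s
METERS_PER_MILE = 1609.344

print(f"{38e-6 * C / METERS_PER_MILE:.3f} miles")   # prints 7.079
```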