GPS satellites are usually in medium Earth orbit, with a typical altitude of 20,200 km. In such an orbit a satellite moves at about 3872 m/s relative to an observer standing on the ground.

(a) A clock on the satellite ticks once every second. To the observer on the ground, what is the time interval between ticks?

(b) Suppose the satellite is sending you radio signals (traveling at the speed of light) and you use them to calculate the distance (d = vt). If you do not take time dilation into account, how long will it take for your distance calculation to be off by 100 m?

Solution

Speed of the satellite: v = 3872 m/s.

(a) Relativistic time dilation: t = t'/sqrt(1 - v^2/c^2).
v/c = 3872/(3e+8) = 1.29e-5, so v^2/c^2 = 1.67e-10.
Because this term is very small, we can use the binomial approximation:
t = t'(1 + (1.67e-10)/2) = t'(1 + 0.835e-10).
For the ground observer the time interval between ticks is (1 + 0.835e-10) s, i.e. about 83.5 ps longer than 1 s.

(b) The radio signal travels at v = c = 3e+8 m/s, so a distance error of 100 m corresponds to a timing error of
Δt = 100/(3e+8) = 3.33e-7 s.
Without the correction, the satellite clock falls behind the ground clock by 0.835e-10 s for every second of elapsed time, so the elapsed time needed to accumulate that error is
T = (3.33e-7)/(0.835e-10) ≈ 4.0e+3 s, i.e. roughly 66 minutes.
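As a sanity check on the arithmetic, here is a short Python sketch (not part of the original solution; the variable names and printed values are my own) that recomputes the Lorentz factor from the stated speed and repeats the part (b) estimate.

```python
import math

c = 3.0e8   # speed of light, m/s
v = 3872.0  # satellite speed relative to the ground observer, m/s (given)

beta2 = (v / c) ** 2                  # v^2/c^2, about 1.67e-10
gamma = 1.0 / math.sqrt(1.0 - beta2)  # exact Lorentz factor
extra_per_tick = gamma - 1.0          # extra seconds per 1 s tick, ~0.83e-10 s (~83 ps)

print(f"v^2/c^2             = {beta2:.3e}")
print(f"extra time per tick = {extra_per_tick:.3e} s")

# Part (b): a 100 m distance error corresponds to a timing error of 100/c.
# The error grows by 'extra_per_tick' seconds for each second of elapsed
# time, so the elapsed time needed is their ratio (~4.0e3 s, about 66 min).
dt_needed = 100.0 / c
elapsed = dt_needed / extra_per_tick
print(f"timing error needed = {dt_needed:.3e} s")
print(f"elapsed time        = {elapsed:.3e} s (~{elapsed / 60:.0f} min)")
```

Running it reproduces the values quoted above to within rounding (the exact factor gives about 0.833e-10 s per tick versus the rounded 0.835e-10 s used in the hand calculation).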