It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard cesium clock in measuring a time-interval of 1 s?
Time allowed to run (T) = 100 years
= 100×365×24×60×60 s
∴ T = 3.15×10⁹ s
Difference after 100 years (d) = 0.02 s
Time difference shown by the clocks in 1 s,
d/T = 0.02/(3.15×10⁹) ≈ 6.35×10⁻¹² s
∴ The accuracy of the standard cesium clock in measuring a time interval of 1 s is
A = T/d = (3.15×10⁹)/0.02 ≈ 1.6×10¹¹
i.e., the clock is accurate to about 1 part in 10¹¹, corresponding to an error of roughly 10⁻¹¹ s in measuring 1 s.
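As a quick check of the arithmetic, here is a minimal Python sketch (T and d are just the quantities defined in the solution; a 365-day year is assumed, as above):

# Arithmetic check for the cesium-clock accuracy estimate (365-day year assumed).
T = 100 * 365 * 24 * 60 * 60        # 100 years in seconds, about 3.15e9 s
d = 0.02                            # drift between the two clocks over 100 years, in s

error_per_second = d / T            # error accumulated per second of running time
accuracy = T / d                    # accuracy expressed as "1 part in ..."

print(f"T                = {T:.3e} s")                   # ~3.154e+09 s
print(f"error per second = {error_per_second:.2e} s")    # ~6.3e-12 s
print(f"accuracy         = 1 part in {accuracy:.2e}")    # ~1.6e+11

Running it reproduces the figures above: an error of roughly 10⁻¹¹ s per second of operation, i.e. an accuracy of about 1 part in 10¹¹.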