It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard cesium clock in measuring a time interval of 1 s?
Total time = 100 years = 100 × 365 × 24 × 60 × 60 s ≈ 3.15 × 10⁹ s
Error accumulated over this time = 0.02 s
Error per second = 0.02 / (3.15 × 10⁹) ≈ 6.34 × 10⁻¹² s
∴ The cesium clock is accurate to about 1 part in 10¹¹ to 10¹² when measuring a time interval of 1 s.
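To check the arithmetic, here is a minimal Python sketch (assuming 365-day years with leap days ignored, as in the solution above) that computes the per-second error:

```python
# Accumulated error of 0.02 s over 100 years, assuming 365-day years
# (leap days ignored, as in the solution above).
SECONDS_PER_YEAR = 365 * 24 * 60 * 60        # 31 536 000 s
total_time_s = 100 * SECONDS_PER_YEAR        # ~3.15e9 s
accumulated_error_s = 0.02

error_per_second = accumulated_error_s / total_time_s
print(f"Total time:       {total_time_s:.3e} s")
print(f"Error per second: {error_per_second:.2e} s")
# Prints ~6.34e-12 s, i.e. an accuracy of roughly
# 1 part in 10^11 to 10^12.
```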