An atomic clock is a type of clock that measures time by counting cycles of a resonant frequency of atoms. Such a clock gains or loses only about one second in three million years.
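As a quick sanity check (an illustrative Python sketch, not part of any real clock implementation), the counting principle and the headline accuracy figure follow from the SI definition of the second, which fixes the caesium-133 hyperfine transition frequency at 9,192,631,770 Hz:

```python
# The SI second is defined as 9,192,631,770 cycles of the caesium-133
# hyperfine transition, so elapsed time is a cycle count divided by
# that frequency.
CS133_HZ = 9_192_631_770  # SI definition of the second

def elapsed_seconds(cycle_count: int) -> float:
    """Convert a count of caesium transition cycles into seconds."""
    return cycle_count / CS133_HZ

# Cross-check the headline figure: an error of 1 second accumulated
# over 3 million years is a fractional error of roughly 1e-14.
SECONDS_PER_YEAR = 365.25 * 86_400
fractional_error = 1 / (3e6 * SECONDS_PER_YEAR)
print(f"{fractional_error:.2e}")  # ~1.06e-14
```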
The first atomic clocks were masers with counting equipment attached. Today’s best atomic clocks are advanced physics instruments based on fountains of laser-cooled atoms. National standards institutes, such as the U.S. National Institute of Standards and Technology (NIST), use clocks with a daily deviation of about 10^-9 seconds, which corresponds to the margin of error of the masers they use. Atomic clocks form the basis of International Atomic Time (TAI), a standard for continuous and stable time measurement. For civil timekeeping, Coordinated Universal Time (UTC), derived from TAI but kept in step with the cycle of day and night by occasional leap seconds, is used.
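A similar back-of-the-envelope calculation (again only a sketch) shows that the quoted daily deviation of 10^-9 seconds is consistent with the one-second-in-three-million-years figure above:

```python
# Restate the 1e-9 seconds-per-day deviation two ways.
SECONDS_PER_DAY = 86_400
daily_dev = 1e-9  # seconds gained or lost per day

# As a fractional frequency error: about 1 part in 10^14.
fractional = daily_dev / SECONDS_PER_DAY
print(f"fractional error: {fractional:.2e}")  # ~1.16e-14

# As time to accumulate a full second of drift: ~2.7 million years.
days_to_1s = 1 / daily_dev
print(f"years to drift 1 s: {days_to_1s / 365.25:.2e}")  # ~2.74e+06
```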
The first atomic clock was built in 1949 at the U.S. National Bureau of Standards (NBS). The first accurate atomic clock was built by Louis Essen at the UK’s National Physical Laboratory in 1955 and measured the resonance of the caesium-133 atom.
In August 2004, NIST scientists demonstrated the first chip-scale atomic clock.