Thursday, November 7, 2013

Stunning Accuracy: The Atomic Clock

An atomic clock is a device that uses an electronic transition frequency in the microwave, optical, or ultraviolet region of the electromagnetic spectrum of atoms as a frequency standard for its timekeeping element. Atomic clocks are the most accurate time and frequency standards known, and are used as primary standards for international time distribution services, to control the wave frequency of television broadcasts, and in global navigation satellite systems such as GPS.

The principle of operation of an atomic clock is based not on nuclear physics but on atomic physics: it uses the microwave signal that electrons in atoms emit when they change energy levels. Early atomic clocks were based on masers at room temperature. Currently, the most accurate atomic clocks first cool the atoms to near absolute zero by slowing them with lasers, then probe them in atomic fountains inside a microwave-filled cavity. An example is NIST-F1, the U.S. national primary time and frequency standard.
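What ties this microwave signal to timekeeping is the definition of the second itself: the SI second is 9,192,631,770 cycles of the caesium-133 hyperfine transition (the same transition mentioned in the history below). The short Python sketch that follows is purely illustrative — the constant is real, but the elapsed_seconds helper is my own invention — and just shows how counting cycles of an oscillator locked to that transition yields elapsed time.

# Illustrative only: a clock locked to the caesium-133 hyperfine transition
# reports time by counting cycles of its oscillator.
CS133_HYPERFINE_HZ = 9_192_631_770  # defining frequency of the SI second

def elapsed_seconds(cycles_counted: int) -> float:
    """Convert a raw cycle count from the locked oscillator into seconds."""
    return cycles_counted / CS133_HYPERFINE_HZ

# One day's worth of cycles maps back to exactly 86,400 s.
print(elapsed_seconds(CS133_HYPERFINE_HZ * 86_400))  # -> 86400.0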

The accuracy of an atomic clock depends on two factors. The first is the temperature of the sample atoms: colder atoms move much more slowly, allowing longer probe times. The second is the frequency and intrinsic linewidth of the electronic transition: higher frequencies and narrower lines increase the precision.
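A rough way to see how these two factors combine is through the line quality factor Q (transition frequency divided by linewidth): longer probe times narrow the line, and higher transition frequencies raise Q further. The sketch below is only a back-of-the-envelope estimate, assuming the probe-time-limited linewidth scales roughly as 1/T (a crude Fourier-limit argument); the helper function name is my own.

def line_quality_factor(transition_hz: float, probe_time_s: float) -> float:
    """Approximate Q for a line whose width is Fourier-limited to ~1/probe_time."""
    linewidth_hz = 1.0 / probe_time_s  # crude Fourier-limit estimate
    return transition_hz / linewidth_hz

# Caesium microwave transition (~9.19 GHz) probed for ~1 s in a fountain:
print(f"{line_quality_factor(9.192631770e9, 1.0):.1e}")  # ~9.2e9

# An optical transition (~4.3e14 Hz) probed for the same 1 s:
print(f"{line_quality_factor(4.3e14, 1.0):.1e}")  # ~4.3e14, hence the interest in optical clocks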

National standards agencies in most industrialized and semi-industrialized countries maintain an accuracy of 10⁻⁹ seconds per day (approximately 1 part in 10¹⁴), with a precision set by the radio transmitter pumping the maser. These clocks collectively define a continuous and stable time scale, International Atomic Time (TAI). For civil time, another time scale is disseminated, Coordinated Universal Time (UTC). UTC is derived from TAI but kept approximately synchronized, by means of leap seconds, with UT1, which is based on the actual rotation of the Earth with respect to solar time.
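As a concrete illustration of the TAI/UTC relationship just described: UTC lags TAI by an integer number of leap seconds. The offset was 35 s around the time of this post, but it changes whenever a leap second is announced, so a real implementation would consult an up-to-date leap-second table rather than the fixed value assumed in this sketch.

from datetime import datetime, timedelta

TAI_MINUS_UTC = timedelta(seconds=35)  # assumption: offset valid in late 2013

def tai_to_utc(tai: datetime) -> datetime:
    """Convert a TAI timestamp to UTC using a fixed leap-second offset."""
    return tai - TAI_MINUS_UTC

print(tai_to_utc(datetime(2013, 11, 7, 0, 0, 35)))  # 2013-11-07 00:00:00 (UTC)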

History

The idea of using atomic transitions to measure time was first suggested by Lord Kelvin in 1879. Magnetic resonance, developed in the 1930s by Isidor Rabi, became the practical method for doing this. In 1945, Rabi first publicly suggested that atomic beam magnetic resonance might be used as the basis of a clock. The first atomic clock was an ammonia maser device built in 1949 at the U.S. National Bureau of Standards (NBS, now NIST). It was less accurate than existing quartz clocks but served to demonstrate the concept. The first accurate atomic clock, a caesium standard based on a certain transition of the caesium-133 atom, was built by Louis Essen in 1955 at the National Physical Laboratory in the UK. Calibration of the caesium standard atomic clock was carried out using the astronomical time scale ephemeris time (ET). This led to the internationally agreed definition of the SI second being based on atomic time. Equality of the ET second with the (atomic clock) SI second has been verified to within 1 part in 10¹⁰. The SI second thus inherits the effect of decisions by the original designers of the ephemeris time scale in determining the length of the ET second.

Since the beginning of development in the 1950s, atomic clocks have been based on the hyperfine transitions in hydrogen-1, caesium-133 and rubidium-87. The first commercial atomic clock was the Atomichron, manufactured by the National Company. More than 50 were sold between 1956 and 1960. This bulky and expensive instrument was subsequently replaced by much smaller rack-mountable devices, such as the Hewlett-Packard model 5060 caesium frequency standard, released in 1964.
In the late 1990s four factors contributed to major advances in clocks:

  • Laser cooling and trapping of atoms
  • So-called high-finesse Fabry-Perot cavities for narrow laser line widths
  • Precision laser spectroscopy
  • Convenient counting of optical frequencies using optical combs (see the short sketch after this list).
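The last item deserves a word of explanation: a frequency comb turns an optical frequency into something countable with ordinary electronics, because each comb tooth sits at f_n = f_ceo + n·f_rep, with both the repetition rate f_rep and the offset f_ceo in the radio-frequency range. The sketch below uses made-up example numbers and a sign convention chosen for simplicity; it only shows the arithmetic, not any particular instrument's procedure.

def optical_frequency(n: int, f_rep: float, f_ceo: float, f_beat: float) -> float:
    """Reconstruct an optical frequency from comb parameters and the measured beat note."""
    # f_n = f_ceo + n * f_rep; the unknown laser beats against the nearest comb tooth.
    return f_ceo + n * f_rep + f_beat  # beat sign assumed positive here

# Illustrative numbers: 1 GHz repetition rate, 20 MHz offset, tooth number 429,000,
# beat of +3 MHz -> an optical frequency near 429 THz.
print(optical_frequency(n=429_000, f_rep=1.0e9, f_ceo=20e6, f_beat=3e6))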

Chip-scale atomic clocks, such as the one unveiled by NIST in 2004, are expected to greatly improve GPS location.
In August 2004, NIST scientists demonstrated a chip-scale atomic clock. According to the researchers, the clock was one-hundredth the size of any other and requires no more than 125 mW, making it suitable for battery-driven applications. This technology became available commercially in 2011.

http://en.wikipedia.org/wiki/Atomic_clock
