An atomic clock is a type of clock that uses an atomic resonance frequency standard as its timekeeping element. Atomic clocks are the most accurate time and frequency standards known, and are used as primary standards for international time distribution services, to control the frequency of television broadcasts, and in global navigation satellite systems such as GPS.

Atomic clocks do not use radioactivity, but rather the precise microwave signal that electrons in atoms emit when they change energy levels. Early atomic clocks were based on masers. Currently, the most accurate atomic clocks are based on absorption spectroscopy of cold atoms in atomic fountains such as the NIST-F1.

National standards agencies maintain an accuracy of 10⁻⁹ seconds per day (approximately 1 part in 10¹⁴), with a precision set by the radio transmitter pumping the maser. The clocks maintain a continuous and stable time scale, International Atomic Time (TAI). For civil time, another time scale is disseminated, Coordinated Universal Time (UTC). UTC is derived from TAI but synchronized, by means of leap seconds, to UT1, which is based on the actual rotation of the Earth.

History

The idea of using atomic vibrations to measure time was first suggested by Lord Kelvin in 1879. The practical method for doing this became magnetic resonance, developed in the 1930s by Isidor Rabi. The first atomic clock was an ammonia maser device built in 1949 at the US National Bureau of Standards (NBS, now NIST). It was less accurate than existing quartz clocks, but served to demonstrate the concept. The first accurate atomic clock, a caesium standard based on a certain transition of the caesium-133 atom, was built by Louis Essen in 1955 at the National Physical Laboratory in the UK. Calibration of the caesium standard atomic clock was carried out by use of the astronomical time scale ephemeris time (ET). This led to the internationally agreed definition of the SI second being based on atomic time. Equality of the ET second with the (atomic clock) SI second has been verified to within 1 part in 10¹⁰. The SI second thus inherits the effect of decisions by the original designers of the ephemeris time scale in determining the length of the ET second.

[Image: JILA's strontium optical atomic clock, which in May 2009 was the world's most accurate clock based on neutral atoms. Shining a blue laser onto ultracold strontium atoms in an optical trap tests how efficiently a previous burst of light from a red laser has boosted the atoms to an excited state; only those atoms that remain in the lower energy state respond to the blue laser, producing the visible fluorescence. Photo credit: Sebastian Blatt, JILA, University of Colorado.]

Since the beginning of development in the 1950s, atomic clocks have been made based on the hyperfine (microwave) transitions in hydrogen-1, caesium-133, and rubidium-87. The first commercial atomic clock was the Atomichron, manufactured by National Company. More than 50 were sold between 1956 and 1960. This bulky and expensive machine was subsequently replaced by much smaller rack-mountable devices, such as the Hewlett-Packard model 5060 caesium frequency standard, released in 1964.

In the late 1990s, four factors contributed to major advances in clocks: laser cooling and trapping of atoms; high-finesse Fabry–Pérot cavities for narrow laser line widths; precision laser spectroscopy; and convenient counting of optical frequencies using optical combs.

In August 2004, NIST scientists demonstrated a chip-scale atomic clock. According to the researchers, the clock was believed to be one-hundredth the size of any other. They also claimed that it required just 75 mW, making it suitable for battery-driven applications. This device could conceivably become a consumer product.

In March 2008, physicists at NIST demonstrated a quantum logic clock based on individual mercury and aluminium ions. These two clocks are the most accurate that have been constructed to date, with neither clock gaining nor losing time at a rate that would exceed a second in over a billion years.

Mechanism

Since 1967, the International System of Units (SI) has defined the second as the duration of 9,192,631,770 cycles of radiation corresponding to the transition between two energy levels of the caesium-133 atom.

This definition makes the caesium oscillator (often called an atomic clock) the primary standard for time and frequency measurements (see caesium standard). Other physical quantities, like the volt and metre, rely on the definition of the second as part of their own definitions.
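
The definition above makes the second a matter of counting cycles. A minimal sketch of that arithmetic (the caesium frequency is exact by definition; the derived figures below are simple consequences of it):

```python
# The SI second as a cycle count of the caesium-133 hyperfine
# transition; 9,192,631,770 Hz is exact by definition.

CS_HYPERFINE_HZ = 9_192_631_770

def cycles_in(seconds):
    """Number of caesium transition cycles in a given duration."""
    return CS_HYPERFINE_HZ * seconds

# One day (86,400 SI seconds) corresponds to this many cycles:
print(cycles_in(86_400))  # 794243384928000

# A clock that miscounts by one cycle per second is off by
# roughly 1 part in 9.2e9:
print(f"{1 / CS_HYPERFINE_HZ:.2e}")  # 1.09e-10
```

Counting such cycles directly is what the microwave electronics of a caesium standard effectively do.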

The actual "clock" of an atomic clock consists of an electronic oscillator operating at microwave frequency. The oscillator is arranged so that its frequency-determining components include an element that can be controlled by a feedback signal. The "atomic" part is used to generate a feedback signal to keep the oscillator tuned to the correct frequency.

The core of the atomic clock is a tunable microwave cavity containing the gas. In a hydrogen maser clock the gas emits microwaves (the gas "mases") on a hyperfine transition, the field in the cavity oscillates, and the cavity is tuned for maximum microwave amplitude. Alternatively, in a caesium or rubidium clock, the beam or gas absorbs microwaves, and the cavity contains an electronic amplifier to make it oscillate. For both types, the atoms in the gas are prepared in one electronic state before they enter the cavity. For the second type, the number of atoms that change electronic state is detected, and the cavity is tuned for a maximum of detected state changes.

This adjustment process is where most of the work and complexity of the clock lies. The adjustment tries to correct for unwanted side-effects, such as frequencies from other electron transitions, temperature changes, and the "spreading" in frequencies caused by ensemble effects. One way of doing this is to sweep the microwave oscillator's frequency across a narrow range to generate a modulated signal at the detector. The detector's signal can then be demodulated to apply feedback to control long-term drift in the radio frequency. In this way, the quantum-mechanical properties of the atomic transition frequency of the caesium can be used to tune the microwave oscillator to the same frequency, except for a small amount of experimental error. When a clock is first turned on, it takes a while for the oscillator to stabilize. In practice, the feedback and monitoring mechanism is much more complex than described above.
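
The sweep-and-demodulate scheme described above can be caricatured in a few lines. This is a hypothetical toy model, not a real clock servo: the Lorentzian line shape, linewidth, dither amplitude and loop gain are all illustrative numbers, and a real feedback loop is far more elaborate.

```python
# Toy model of locking an oscillator to an atomic resonance: dither the
# oscillator frequency across the line, compare the detector response on
# either side, and steer toward line centre. All parameters illustrative.

def resonance(f, f0=9_192_631_770.0, width=100.0):
    """Lorentzian absorption profile centred on the atomic transition."""
    return 1.0 / (1.0 + ((f - f0) / width) ** 2)

def error_signal(f_lo, dither=10.0):
    """Demodulated error: positive when the oscillator sits below
    line centre, negative above, zero exactly on resonance."""
    return resonance(f_lo + dither) - resonance(f_lo - dither)

# Start the oscillator 300 Hz off-resonance and let feedback pull it in.
f_lo = 9_192_631_770.0 + 300.0
gain = 200.0  # loop gain (illustrative)
for _ in range(200):
    f_lo += gain * error_signal(f_lo)

print(f_lo - 9_192_631_770.0)  # residual detuning, essentially 0 Hz
```

The sign of the demodulated signal tells the servo which side of the resonance the oscillator is on, which is the essential idea; real clocks add temperature control, magnetic shielding, and compensation for neighbouring transitions.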

A number of other atomic clock schemes are in use for other purposes. Rubidium standard clocks are prized for their low cost, small size (commercial standards are as small as 400 cm³) and short-term stability. They are used in many commercial, portable and aerospace applications. Hydrogen masers (often manufactured in Russia) have superior short-term stability compared to other standards, but lower long-term accuracy.

Often, one standard is used to fix another. For example, some commercial applications use a rubidium standard periodically corrected by a GPS receiver. This achieves excellent short-term accuracy, with long-term accuracy equal to (and traceable to) the U.S. national time standards.

The lifetime of a standard is an important practical issue. Modern rubidium standard tubes last more than ten years, and can cost as little as US$50. Caesium reference tubes suitable for national standards currently last about seven years and cost about US$35,000. The long-term stability of hydrogen maser standards decreases because of changes in the cavity's properties over time.

Modern clocks use magneto-optical traps to cool the atoms for improved precision.

Power consumption

Power consumption varies enormously, but scales crudely with size. Chip-scale atomic clocks can use power on the order of 100 mW; NIST-F1 uses power orders of magnitude greater.

Research

Most research focuses on the often conflicting goals of making the clocks smaller, cheaper, more accurate, and more reliable.

New technologies, such as femtosecond frequency combs, optical lattices and quantum information, have enabled prototypes of next generation atomic clocks. These clocks are based on optical rather than microwave transitions. A major obstacle to developing an optical clock is the difficulty of directly measuring optical frequencies. This problem has been solved with the development of self-referenced mode-locked lasers, commonly referred to as femtosecond frequency combs. Before the demonstration of the frequency comb in 2000, terahertz techniques were needed to bridge the gap between radio and optical frequencies, and the systems for doing so were cumbersome and complicated. With the refinement of the frequency comb these measurements have become much more accessible and numerous optical clock systems are now being developed around the world.
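
The way a comb bridges the optical and radio domains comes down to one relation: each comb line sits at f(n) = n · f_rep + f_ceo, where the repetition rate f_rep and the carrier-envelope offset f_ceo are both radio frequencies that ordinary electronics can count. A quick illustration, using plausible but hypothetical values for the comb parameters and the optical transition:

```python
# How a femtosecond frequency comb makes an optical frequency countable.
# Comb line n sits at f_n = n * f_rep + f_ceo; both f_rep and f_ceo are
# radio frequencies. All numbers below are illustrative, not a real comb.

f_rep = 250e6           # repetition rate, 250 MHz
f_ceo = 20e6            # carrier-envelope offset, 20 MHz
f_optical = 429.228e12  # an optical transition near 429 THz

# Comb mode closest to the optical frequency:
n = round((f_optical - f_ceo) / f_rep)
f_n = n * f_rep + f_ceo
print(n)  # 1716912 (a mode number of ~1.7 million)

# The optical frequency is then measured via a radio-frequency beat
# note between the laser and that comb line:
beat = abs(f_optical - f_n)
print(beat)  # 20000000.0 (a countable 20 MHz beat)
```

Measuring f_rep, f_ceo and the beat note with radio-frequency counters thus pins down the optical frequency, which is what replaced the cumbersome terahertz bridging chains.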

As in the radio range, absorption spectroscopy is used to stabilize an oscillator, in this case a laser. When the optical frequency is divided down into a countable radio frequency using a femtosecond comb, the bandwidth of the phase noise is also divided by that factor. Although the phase-noise bandwidth of a laser is generally greater than that of stable microwave sources, after division it is smaller.
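
The gain from division can be put in numbers. For ideal frequency division by a factor N, phase excursions shrink by N and phase-noise power by N², i.e. 20·log₁₀(N) dB; the carrier and output frequencies below are illustrative round figures:

```python
import math

# Back-of-the-envelope benefit of dividing an optical carrier down to a
# radio frequency: ideal division by N reduces phase-noise power by N^2,
# i.e. 20*log10(N) dB. Frequencies below are illustrative round numbers.

f_optical = 429e12  # optical carrier, ~429 THz
f_rf = 10e6         # divided-down radio frequency, 10 MHz

N = f_optical / f_rf
reduction_db = 20 * math.log10(N)
print(f"N = {N:.3g}")                    # N = 4.29e+07
print(f"reduction ~ {reduction_db:.0f} dB")  # reduction ~ 153 dB
```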

The two primary systems under consideration for use in optical frequency standards are single ions isolated in an ion trap and neutral atoms trapped in an optical lattice. These two techniques allow the atoms or ions to be highly isolated from external perturbations, thus producing an extremely stable frequency reference.

Optical clocks have already achieved better stability and lower systematic uncertainty than the best microwave clocks. This puts them in a position to replace the current standard for time, the caesium fountain clock.

Atomic clocks are used to generate standard frequencies. They are installed at the sites of time signal, LORAN-C and Alpha navigation transmitters. They are also installed at some longwave and mediumwave broadcasting stations to deliver a very precise carrier frequency, which can also serve as a standard frequency.

Furthermore, atomic clocks are used for long-baseline interferometry in radio astronomy.

Radio clocks

A radio clock is a clock that automatically sets itself to atomic time by means of government radio time signals received by a radio receiver. Many retailers market radio clocks inaccurately as "atomic clocks"; although the radio signals they receive come from atomic clocks, they are not atomic clocks themselves. They provide a way of getting high-quality atomic-derived time over a wide area using inexpensive equipment. Although the government time broadcasts are themselves extremely accurate, many consumer radio clocks only set themselves once a day, and so are only accurate to about a second. To take advantage of the full accuracy of the time signals, instrument-grade time receivers must be used. There is a transit delay of approximately 1 ms for every 300 kilometers (186 mi) between the transmitter and the receiver.
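
The 1 ms per 300 km figure is simply the propagation delay of a radio signal at the speed of light, easy to check:

```python
# Propagation delay of a radio time signal: distance divided by the
# speed of light. This reproduces the ~1 ms per 300 km rule of thumb.

C = 299_792_458.0  # speed of light, m/s

def transit_delay_ms(distance_km):
    """One-way radio propagation delay in milliseconds."""
    return distance_km * 1000 / C * 1000

print(round(transit_delay_ms(300), 3))   # 1.001 ms
print(round(transit_delay_ms(1000), 2))  # 3.34 ms
```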

Time signals produced by atomic clocks are broadcast by government-run longwave time radio transmitters in many countries around the world, such as DCF77 (Germany), HBG (Switzerland), JJY (Japan), MSF (United Kingdom), TDF (France) and WWVB (United States). These signals can be received far outside their country of origin (JJY can sometimes be received even in Western Australia and Tasmania at night), so there are very few regions of the world where precision atomic-derived time is not available.

Global Positioning System

GPS provides very accurate timing and frequency signals. A GPS receiver works by measuring the relative time delay of signals from four or more GPS satellites, each of which carries three or four onboard caesium or rubidium atomic clocks. The four relative times are mathematically transformed into three absolute position coordinates and one absolute time coordinate. The time is accurate to within about 50 nanoseconds. However, inexpensive GPS receivers may not assign a high priority to updating the display, so the displayed time may differ perceptibly from the internal time. Precision time references that use GPS are marketed for use in computer networks, laboratories, and cellular communications networks, and maintain accuracy to within about 50 ns.
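
The transformation from four measured delays into position and time can be sketched with a Newton iteration on synthetic data. The satellite positions, receiver location and clock bias below are made-up illustrative numbers, and a real receiver must additionally model satellite motion, atmospheric delays and relativistic effects:

```python
# Toy pseudorange solution: four satellites, four unknowns (x, y, z and
# the receiver clock bias b). residual_i = |sat_i - pos| + C*b - rho_i.
# All satellite and receiver coordinates below are made-up numbers.

import math

C = 299_792_458.0  # speed of light, m/s

sats = [  # hypothetical satellite positions, metres
    (15600e3, 7540e3, 20140e3),
    (18760e3, 2750e3, 18610e3),
    (17610e3, 14630e3, 13480e3),
    (19170e3, 610e3, 18390e3),
]
true_pos = (6371e3, 0.0, 0.0)  # receiver on the Earth's surface
true_bias = 1e-4               # receiver clock error: 100 microseconds

def dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Simulated pseudoranges: geometric range plus the clock-bias term.
pseudo = [dist(s, true_pos) + C * true_bias for s in sats]

def solve4(A, b):
    """Solve a 4x4 linear system by Gaussian elimination with pivoting."""
    n = 4
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Newton iteration starting from the Earth's centre with zero bias.
pos, bias = [0.0, 0.0, 0.0], 0.0
for _ in range(10):
    J, r = [], []
    for s, rho in zip(sats, pseudo):
        d = dist(s, pos)
        J.append([(pos[k] - s[k]) / d for k in range(3)] + [C])
        r.append(d + C * bias - rho)
    dx = solve4(J, [-ri for ri in r])
    pos = [p + step for p, step in zip(pos, dx[:3])]
    bias += dx[3]

print([round(p) for p in pos])  # recovers ~ (6371000, 0, 0)
print(bias * 1e6)               # recovers ~ 100 microseconds
```

Because the clock-bias column of the Jacobian is the constant C, a timing error of one microsecond trades off against roughly 300 m of position error, which is why the satellites' atomic clocks matter so much.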