The home of the theoretical physics group and the quantized musings blog at the University of Nevada, Reno

A data archive for storing precision measurements [Physics Today]

D. Budker and A. Derevianko, Physics Today, September 2015, page 10.

Precision measurements are essential to our understanding of the fundamental laws and symmetries of nature.

Traditionally, fundamental symmetry tests focused on effects that are either time independent or subject to periodic modulation due to Earth’s rotation about its axis or its revolution around the Sun. In recent years, however, attention has been drawn to time-varying effects, starting with searches for a possible temporal variation of fundamental “constants.” More recently still, researchers have begun looking for transient effects1 and oscillating effects2 due to ultralight bosonic particles that could be components of dark matter or dark energy.

To search for nonuniform dark energy or dark matter, researchers have proposed networks of atomic magnetometers and clocks.1 The readings of remotely located network sensors are synchronized—for example, using the timing provided by GPS—and analyzed for specific transient features. Also being discussed are hybrid networks consisting of different types of sensors that would be sensitive to different possible interactions with the dark sector (see http://www.nature.com/nphys/journal/v10/n12/extref/nphys3137-s1.pdf).
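To illustrate the idea, here is a minimal sketch, with entirely hypothetical numbers, of how synchronized readings from two remote sensors could be analyzed for a common transient: a signal sweeping across the network would appear in both data streams with a characteristic relative delay, which a cross-correlation of the time-stamped records can recover.

```python
import numpy as np

# Hypothetical illustration: two remote sensors record synchronized,
# GPS-time-stamped readings (here, 1000 samples each of weak noise).
rng = np.random.default_rng(0)
n = 1000
stream_a = rng.normal(0.0, 0.2, n)
stream_b = rng.normal(0.0, 0.2, n)

# Inject the same transient pulse into both streams, with sensor A
# seeing it 30 samples earlier than sensor B.
pulse = 8.0 * np.exp(-0.5 * ((np.arange(40) - 20) / 5.0) ** 2)
stream_a[400:440] += pulse
stream_b[430:470] += pulse

# Cross-correlate the synchronized streams; the peak lag estimates the
# transient's arrival-time difference between the two sites.
xcorr = np.correlate(stream_a - stream_a.mean(),
                     stream_b - stream_b.mean(), mode="full")
lag = int(np.argmax(xcorr)) - (n - 1)
print(lag)  # expected near -30: sensor A saw the pulse earlier
```

A real analysis would of course contend with correlated noise, unequal sampling rates, and sensor-specific responses; the point is only that synchronized time stamps are what make such a correlation across sites possible at all.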

A compelling example of time-stamped and stored datasets is the set of orbit and clock estimates of the Global Navigation Satellite Systems (GNSS) available through the International GNSS Service (http://igscb.jpl.nasa.gov). This service is the backbone of modern precision geodesy. The available multiyear archival data can be used to search for transient variations of fundamental constants associated with the galactic motion through the dark-matter halo (see http://www.dereviankogroup.com/gps-dm/).

The field of precision measurement appears to be undergoing a paradigm shift, with new theoretical and experimental ideas sprouting almost daily. For instance, reanalysis of data from experiments that used atomic dysprosium to look for variation of the fine-structure constant and to test Lorentz invariance has set new limits on scalar dark matter.3,4 That reanalysis was made possible by the existence of well-documented, accessible data sets stored electronically.

An example of a new experimental idea is using precise beam-position monitors in particle accelerators to test for specific types of Lorentz-invariance violations.5

Inspired by all those exciting developments, we propose that data streams from any ongoing precision measurements be time-stamped and stored for possible future analysis. We are convinced that the cost of data storage and GPS timing is relatively small and that an archive would be straightforward to implement technically, though, of course, the price and complexity crucially depend on the precision of the time stamp and on the data rate. Conversely, failing to time-stamp and store the data is likely to be an enormous waste. The search for transient effects of the dark sector is already a good motivation to create a data archive, and additional ideas of how to use such data are likely to emerge in the future.
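To convey how little machinery such an archive requires, here is a minimal sketch of an append-only, time-stamped record format. All names (the sensor identifier, the record fields) are hypothetical, and the time stamp is assumed to come from a GPS-disciplined clock; in practice it would be read from the receiver itself rather than from the system clock.

```python
import json
import time

def make_record(sensor_id, value, timestamp=None):
    """Bundle one precision measurement with its UTC time stamp."""
    return {
        "sensor": sensor_id,   # which instrument produced the reading
        "utc": timestamp if timestamp is not None else time.time(),
        "value": value,        # the raw measured quantity
    }

# Hypothetical reading from a magnetometer, stamped with a fixed UTC time.
record = make_record("magnetometer-01", 3.14e-12, timestamp=1441065600.0)
line = json.dumps(record)      # one JSON line per reading, appended to a log
print(line)
```

One self-describing line per reading, appended to a log file, is already an archive that future analyses can cross-correlate against other instruments.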

What information should be time-stamped and recorded as a raw data stream? Data from optical and matter interferometers, experiments measuring parity violation and looking for permanent electric dipole moments, precision-measurement ion traps, all precision experiments with antimatter, and, by default, anything measured precisely.

We live in the age of Google and GPS; our thinking about experimental data should be keeping up with the times!