Derevianko Group
http://www.dereviankogroup.com
The home of the theoretical physics group and quantized musings blog at the University of Nevada, Reno

A data archive for storing precision measurements [Physics Today]
http://www.dereviankogroup.com/a-data-archive-for-storing-precision-measurements-physics-today/
Wed, 02 Sep 2015
D. Budker and A. Derevianko, Physics Today, September 2015, page 10.

Precision measurements are essential to our understanding of the fundamental laws and symmetries of nature.

Traditionally, fundamental symmetry tests focused on effects that are either time independent or subject to periodic modulation due to Earth’s rotation about its axis or its revolution around the Sun. In recent years, however, attention has been drawn to time-varying effects, starting with searches for a possible temporal variation of fundamental “constants.” Even more recently, researchers have been looking for transient effects [1] and oscillating effects [2] due to ultralight bosonic particles that could be components of dark matter or dark energy.

To search for nonuniform dark energy or dark matter, researchers have proposed networks of atomic magnetometers and clocks [1]. The readings of remotely located network sensors are synchronized—for example, using the timing provided by GPS—and analyzed for specific transient features. Also being discussed are hybrid networks consisting of different types of sensors that would be sensitive to different possible interactions with the dark sector (see http://www.nature.com/nphys/journal/v10/n12/extref/nphys3137-s1.pdf).
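As a toy illustration of such an analysis (everything here — the sensor names, the thresholds, the data — is invented for the sketch), one might scan the synchronized streams for excursions that several sensors register at nearly the same GPS time:

```python
from collections import defaultdict

def coincident_transients(readings, threshold, window):
    """Flag GPS epochs at which several sensors deviate together.

    readings: dict sensor_name -> list of (gps_time, value) pairs,
    values assumed pre-detrended so 0 is the quiet baseline.
    """
    hits = defaultdict(set)
    for name, series in readings.items():
        for t, v in series:
            if abs(v) > threshold:
                # bucket excursions into coarse time bins of width `window`
                hits[round(t / window)].add(name)
    # a candidate event needs excursions in at least two sensors in one bin
    return sorted(b * window for b, names in hits.items() if len(names) >= 2)

readings = {
    "clock_A": [(0.0, 0.1), (10.0, 3.0), (20.0, 0.2)],
    "clock_B": [(0.0, -0.2), (10.5, -2.5), (20.0, 0.1)],
}
print(coincident_transients(readings, threshold=1.0, window=2.0))  # -> [10.0]
```

A real search would of course model the expected defect signature and the noise of each sensor, but the essence — demanding coincidence across remote, GPS-synchronized devices — is what rejects local perturbations.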

A compelling example of time-stamped and stored datasets is the orbit and clock estimates of the Global Navigation Satellite Systems (GNSS) available through the International GNSS Service (http://igscb.jpl.nasa.gov). This service is the backbone of modern precision geodesy. The available multiyear archival data can be used to search for transient variations of fundamental constants associated with the galactic motion through the dark-matter halo (see http://www.dereviankogroup.com/gps-dm/).

The field of precision measurement appears to be undergoing a paradigm shift, with new theoretical and experimental ideas sprouting almost daily. For instance, reanalysis of data from experiments using atomic dysprosium to look for variation of the fine-structure constant and to test Lorentz invariance has set new limits on scalar dark matter [3,4]. That has been made possible by the existence of well-documented, accessible data sets stored electronically.

An example of a new experimental idea is using precise beam-position monitors in particle accelerators to test for specific types of Lorentz-invariance violations [5].

Inspired by all those exciting developments, we propose that data streams from any ongoing precision measurements be time-stamped and stored for possible future analysis. We are convinced that the cost of data storage and GPS timing is relatively small and that the data storage will be straightforward to implement technically, though, of course, the price and complexity crucially depend on the precision of the time stamp and the data rate. Conversely, failing to time-stamp and store the data is likely to be an enormous waste. The search for transient effects of the dark sector is already a good motivation to create a data archive, and additional ideas of how to use such data are likely to emerge in the future.

What information should be time-stamped and recorded as a raw data stream? Data from optical and matter interferometers, experiments measuring parity violation and looking for permanent electric dipole moments, precision-measurement ion traps, all precision experiments with antimatter, and, by default, anything measured precisely.

We live in the age of Google and GPS; our thinking about experimental data should be keeping up with the times!

The postdoc will be located at the University of Nevada, Reno and will be directly collaborating with Dr. Andrei Derevianko (Physics) and Dr. Geoffrey Blewitt (Nevada Geodetic Laboratory). Strong computational skills and familiarity with statistical analysis are preferred.

Dark matter search with GPS: Q&A
http://www.dereviankogroup.com/dark-matter-search-gps-qa/
Thu, 20 Nov 2014

In the aftermath of our paper with Maxim Pospelov, "Hunting for topological dark matter with atomic clocks," being published, there were quite a number of e-mails with questions about our proposal. There was even an offer of free-of-charge use of a powerful computational cluster (thank you!). I apologize for not answering all e-mails individually - there is just not enough time. One of my friends also sent me a link to this reddit thread - there is genuine interest in the details of the proposal. This post is intended to answer some of these questions.

First of all, see the previous post that outlines the basic idea of the search.

Topological dark matter:
There are two components that go into dark-matter model building: (i) what the dark-matter objects are and (ii) how these objects interact non-gravitationally with us (baryonic, or ordinary, matter). I emphasize the word non-gravitationally, as the gravitational interaction is a given, due to multiple observations of gravitational interactions between dark and ordinary matter (and their consistency with general relativity).

Additional model constraints come from various observations and cosmological simulations. Still, the allowed parameter space is enormous: even if one assumes that the dark-matter objects are made of elementary particles, the allowed masses span 50 orders (!) of magnitude. This is a testament to the current state of confusion in modern physics and cosmology. The field is ripe for discoveries.

First of all, I admit that our model (due to Maxim Pospelov) is speculative, but it is as good as any model out there. WIMPs and axions have additional attractive features, as they might also solve other outstanding problems in physics (for example, the strong-CP problem can be solved with axions).

So what is the model? (Here you might get lost; just read on.) For experts, a technical discussion can be found in the extensive supplementary material to our paper.

Well, you start with a quantum field, and this field has some self-interaction built in. The interaction is such that it allows for several identical minima. For example, the same value of the potential minimum could be reached at two distinct values of the field, +A and -A. Now, as the Universe expands, it cools down and the field has to settle at a minimum of the potential. The field is torn over which value to choose - the choices +A and -A are equivalent. So in some regions of space it picks +A and in other regions it picks -A. This is called "spontaneous symmetry breaking".

Nature does not like discontinuities, and the +A and -A domains have to be smoothly connected. This transition region is the topological defect, or cosmic wall. The thickness of the wall is given by the particle's Compton wavelength h/(m c), where m is the particle mass, h is the Planck constant, and c is the speed of light.
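A quick back-of-the-envelope evaluation of this formula (using standard CODATA constants) shows how an ultralight particle yields a macroscopic wall:

```python
# Wall thickness ~ Compton wavelength h/(m c).
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # J per eV

def compton_wavelength(rest_energy_eV):
    """Compton wavelength h/(m c) for a particle of rest energy m c^2."""
    m = rest_energy_eV * eV / c**2
    return h / (m * c)

# A particle of ~1e-14 eV gives a wall roughly ten Earth diameters thick.
d = compton_wavelength(1e-14)
print(f"{d:.2e} m")
```

This is exactly the mass scale quoted below for GPS-detectable, Earth-sized defects.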

This example is overly simplistic, but it demonstrates how topological defects are formed as the Universe cools down. In fact, for a dark-matter model you would like the field to be zero everywhere except inside the defects (see the supplement): all the energy (or mass) is then stored in the topological defects.

Depending on the field's degrees of freedom (scalar vs. vector fields) and the self-interaction potential, one may form defects of various geometries: monopoles, strings, or domain walls. Especially interesting is the case of monopoles (spherically symmetric objects), as a gravitationally interacting gas of monopoles mimics dark matter. The size of the defect is a free parameter - we have no constraints on how large it could be. GPS would be sensitive to Earth-sized monopoles (a huge Compton wavelength, translating into a particle mass of ~10^-14 eV).

Here is a real-life example of spontaneous symmetry breaking and topological defects (due to Rafael Lang, from an interview to appear in Sensing Our Planet magazine):

“There’s a wedding and a hundred people are sitting at this big round table. Somebody starts eating the salad. They pick up the fork on their left, so the person next to them has to pick up the fork on the left. Now the bride also starts eating, picking up the fork on the right, so everybody around the bride picks up the right fork. At some point in between this poor guy will be sitting with no fork; on his other side will be someone with two forks. Those two guys are called a topological defect. There’s nothing special going on around the left, the right, but where those two guys are sitting, there’s a disruption of the forks.”

Ok, so we are done with choosing the dark-matter objects. The second ingredient is the non-gravitational interaction between dark-matter objects and us. Here you do need to pick one that is "reasonable" (e.g., Lorentz-invariant) and sufficiently weak that it has gone unnoticed in dedicated experiments and observations. The interaction we picked is of this kind. Effectively, when the defect overlaps with us, it pulls on the particle (electron, proton, neutron, etc.) masses and on the forces acting between the particles. Mind you, this pull is really weak - otherwise we would have noticed it. However, there are ultra-sensitive devices, like atomic clocks (see this post), that may be sensitive to such pulls. You might ask why it might have gone unnoticed before in atomic clocks - some of the reasons are purely psychological and are related to how an experimentalist discerns signal from noisy background (see this post).
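To get a feel for the time scales involved, here is one more back-of-the-envelope number (the velocity and constellation size are rough assumed values, not results from the paper): a wall sweeping through the GPS constellation at galactic velocities produces a correlated pattern of clock glitches lasting a few minutes.

```python
# Rough numbers for a domain wall sweeping through the GPS constellation.
v_galactic = 3.0e5              # m/s, typical velocity relative to the halo
constellation_diameter = 5.3e7  # m, roughly twice the GPS orbital radius

sweep_time = constellation_diameter / v_galactic
print(f"sweep time ~ {sweep_time:.0f} s")  # roughly three minutes
```

It is this ordered, minutes-long sweep across time-stamped satellite clocks - rather than any single glitch - that would distinguish a defect crossing from mundane clock noise.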

Migdal's advice
http://www.dereviankogroup.com/migdals-advise/
Sun, 09 Nov 2014

A common mistake of beginners is the desire to understand everything completely right away. In real life understanding comes gradually, as one becomes accustomed to the new ideas. One of the difficulties of scientific research is that it is impossible to make progress without clear understanding, yet this understanding can come only from the work itself; every completed piece of research represents a victory over this contradiction.

Nature bibliography style bst-file with no URLs
http://www.dereviankogroup.com/nature-bibliography-style-bst-file-urls/
Wed, 24 Sep 2014

If you have ever gone through the grueling process of re-formatting your LaTeX manuscript to match the Nature style, you will appreciate this little trick.

A typical LaTeX distribution includes the nature package, which ships with a nature.bst file. *.bst files govern how BibTeX bibliographies are formatted. The nature.bst file does not handle URLs well, so Peter Komar has hacked the original nature.bst to remove URLs from the bibliography.

Peter has generously agreed to share his hack with the community. The file naturemag_noURL.bst can be downloaded here.

The easiest solution is to place this file into the directory with your manuscript files and to add \bibliographystyle{naturemag_noURL} to your tex document.
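A minimal sketch of such a manuscript (the citation key and the refs.bib name are placeholders; naturemag_noURL.bst sits in the same directory):

```latex
% manuscript.tex, alongside naturemag_noURL.bst and refs.bib
\documentclass{article}
\begin{document}
A claim that needs a reference~\cite{somekey}.
\bibliographystyle{naturemag_noURL}
\bibliography{refs}
\end{document}
```

Run latex, then bibtex, then latex twice more, and the bibliography comes out in the Nature format without URLs.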

The world's time
http://www.dereviankogroup.com/worlds-time/
Wed, 10 Sep 2014

I was asked to write a news story for the American Physical Society's Forum on International Physics newsletter. Here is my contribution.

As I type away at this text, I become aware that time just continues its quiet flow and my keyboard clicks measure its passage. And whatever poetical, philosophical, or religious meaning one might assign to “time”, that is how time is defined: as a measurable sequence of events.

And time being measurable naturally means that physicists are in business.

Atomic clocks are arguably the most accurate devices ever built. While a typical wristwatch keeps time accurate to about a second over a week, modern atomic clocks aim at neither gaining nor losing a second over the age of the Universe. Imagine: if some poor soul had built such a clock at the beginning of time, at the Big Bang, and for some good reason it had survived all the cosmic cataclysms, today it would be off by less than a heartbeat.

Atomic clocks are ubiquitous and one could buy a slightly used one on the internet. Among many places, they tick away on stock exchanges, in data centers, and in the hearts of GPS satellites. However, there is a truly special collection of several hundred atomic clocks distributed among 50 or so industrialized countries that defines the world’s time. This timescale is known as the TAI (from the French “Temps Atomique International”) or the international atomic time.

A collection of atomic clocks at the Physikalisch-Technische Bundesanstalt (PTB), Germany. These clocks substantially contribute to the TAI timescale, the world’s time. Credit: PTB

BIPM (Bureau International Des Poids Et Mesures) is at the heart of defining the world’s time. This international organization is located in a white wooden two-story building on the forested bank of the Seine River in the Parisian suburbs. Judah Levine from NIST-Boulder explains that BIPM was established in 1875 by the international “Treaty of the Meter,” which defined the kilogram and the meter. Later the second was added to the convention (the SI units), and the meter was redefined in terms of the fixed speed of light and the second. The modern legal definition of the second involves a certain number of beats derived from the hyperfine splitting of the cesium-133 atom.

Judah Levine has been contributing US data to TAI for nearly half a century. He explains that BIPM collects clock data from metrology labs and averages them. BIPM then distributes a document called “the Circular T,” which tells by how much the national timescales were off from the average about a month ago. In turn, based on this circular, the national labs steer their local timescales to correct for drifts from TAI. This protocol keeps the world’s time stable at the level of a nanosecond over a month.
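As a toy illustration of why this works (a deliberately simplified model with made-up numbers, not BIPM's actual algorithm), compare a free-running local timescale with one that steers out part of its reported offset each month:

```python
# Toy model of the Circular T loop: each month a lab learns its past offset
# from the TAI average and removes a fraction of it. Numbers are illustrative.
def free_running(drift_ns_per_month, months):
    """Offset of an unsteered timescale: drift simply accumulates."""
    return drift_ns_per_month * months

def steered(drift_ns_per_month, months, gain=0.5):
    """Offset when a fraction `gain` of the reported offset is removed monthly."""
    offset = 0.0
    for _ in range(months):
        offset += drift_ns_per_month   # local clock drifts for a month
        offset -= gain * offset        # correction once the Circular T arrives
    return offset

print(free_running(20.0, 12))          # 240 ns if nobody steers
print(round(steered(20.0, 12), 1))     # stays bounded near 20 ns
```

The steered offset converges to a bounded value instead of growing without limit, even though each correction acts on month-old information.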

The most advanced metrology labs rely on so-called primary frequency standards, super-precise cesium clocks, says Peter Rosenbusch of the Laboratoire Nationale de Métrologie et d'Essais and the Paris Observatory. The primary standards are occasionally used to calibrate the local “workhorse” continuously running atomic clocks to the SI definition of time as closely as possible. In the US, the primary frequency standard is the cesium fountain clock at NIST-Boulder.

So if the world’s time is the time counted by atomic clocks, is it the same as cosmic time? In principle one could measure time using pulsars, magnetized rotating neutron stars. Pulsars, however, tend to slow down as they radiate away their rotational energy, and, moreover, Judah Levine points out that the very shape of the pulses also changes over time, making counting the pulses imprecise. We joke that, perhaps, to define the Standard Galactic Time one needs to find more stable cosmic sources.

Nevertheless, space and satellite technology are anticipated to improve TAI. Christophe Salomon of Ecole Normale Supérieure in Paris is involved with the ACES (Atomic Clock Ensemble in Space) mission of the European Space Agency. He explains that the goal is to operate the most precise primary Cs frequency standard onboard the International Space Station (ISS). The clock is expected to become operational in space in two years. The ISS will broadcast a microwave time signal down to several Earth-based stations. In the US, the stations will be installed at JPL in Pasadena and at NIST-Boulder. Through the ACES mission, national labs around the globe will establish high-precision links to compare primary standards and thus remove some uncertainties in their contributions to the world’s time.

Neither time nor its definition is still. There are new generations of atomic clocks based on ultracold atoms and ions that already outperform the primary frequency standards. Pushing these quantum devices to their limits is a friendly competition between several labs around the world. Just over the past year, the crown of the world’s most precise clock has been shared by the USA (two teams, at JILA and NIST-Boulder), Japan, and Germany. These advances were summarized in recent talks by E. Peik (PTB, Germany) and A. Ludlow (NIST-Boulder, USA) at the International Conference on Atomic Physics, held last July in the historic Mayflower hotel in Washington, DC.

Considering this rapid progress in atomic horology, the international community is discussing how to redefine the second in terms of these novel classes of clocks. This would mean retiring Cs from the SI units and redefining the world’s time.

Clock-comparison technology is improving as well. The European Union is building a trans-European clock network that uses existing optical-fiber communication links to compare clocks at metrology labs directly, removing the uncertainties of over-the-air and over-space comparisons. The first 920-km link, between the northern and southern parts of Germany, has already been tested.

One apparent limitation of the TAI timescale is that it is a “paper timescale” – it only shows what the world’s time was a month ago. What if the dedicated clocks were compared and averaged continuously or, even better, formed one single geographically distributed clock? This was envisioned recently by a group of physicists led by Mikhail Lukin at Harvard and Jun Ye at JILA in Colorado. They proposed a quantum network of atomic clocks (for example, placed on satellites orbiting the Earth) that would use quantum entanglement to create one giant distributed clock, with each nation contributing satellites to the network. Jun Ye comments, “this is definitely a futuristic proposal, and we must achieve substantial technological advances. However, all of the different building blocks for the network have in principle been demonstrated in small scales.” Maybe this is how the world’s time will be measured in the far future.

I would also like to thank Jeff Sherman of NIST-Boulder, Ekkehard Peik of PTB, and Peter Komar of Harvard for illuminating discussions.

About the author: Dr. Andrei Derevianko is a Russian-American theoretical physicist and a professor at the University of Nevada, Reno. He has contributed to the development of several novel classes of atomic clocks and precision tests of fundamental symmetries with atoms and molecules.

Fundamental physics at the precision frontier: questions to ponder
http://www.dereviankogroup.com/fundamental-physics-precision-frontier-17-new-questions-ponder/
Mon, 07 Jul 2014

The closing session of the Perimeter workshop on "New ideas in low-energy tests of fundamental physics" was a stimulating discussion of open questions at the intersection of precision measurements and fundamental physics. The discussion was guided by Derek Kimball's list of questions, which he kindly shares below. The video/audio record of the entire discussion can be found online here: part1 and part2.

Are there boring answers to exciting mysteries?

If one assumes, from the experimental perspective, the most boring solutions to mysteries: for example, a cosmological constant driving the accelerating expansion of the universe and dark matter that has no couplings to Standard Model particles, what mysteries still cannot be resolved?

Technical naturalness: for now should we not be overly concerned about this issue for experiments?

Hierarchy problem and its relation to the observed Higgs mass, cosmological constant, BICEP-2, Planck scale: how does this relate to the scale of new physics and where we should search?

New (?) idea of searching for fast-varying constants: could this be done in an astrophysical spectroscopic search?

It was noted that a phase-transition process (or evolving couplings) could be introduced to “avoid” technical naturalness problems... could there be phase transitions with very small effects that occur frequently, perhaps even today? (Something for GNOME or clock networks to look for?)

Impact of BICEP-2 results

BICEP-2 results: if assumed to be correct, what do they imply about the best regimes/scenarios/experiments to search for new physics?

Does BICEP-2 imply that lots of interesting new physics stuff inflates away?

How plausible is scale evolution of physics to avoid BICEP-2 “problems” and what are experimental signatures of scale evolution?

Relation between astrophysical and laboratory searches

Ideas like chameleon fields: what kind of mechanisms exist to hide interactions in laboratory tests and allow astrophysically, or allow in laboratory tests and hide astrophysically? How plausible are these, and how seriously should constraints be taken?

What is the state of knowledge about coupling between dark matter particles? Would coupling between DM particles make some difference between laboratory vs. astrophysical bounds? What if DM is more complex (not just one species) and 5% is coupled strongly to itself: could one have, for example, axion stars, etc.? Could such objects give transient signals?

Transient and time-dependent new physics signals

What kind of new physics can GNOME, clock network, CASPEr, or related experiments access that laboratory experiments cannot access?

Is there anything that can be said about the scale of domains or the time between transient signals?

Higher dimension topological defects and textures were mentioned. What are these, and what are interesting signatures and characteristics?

It was noted that the photon mass could be altered inside a topological defect: could this be measured with the GNOME or the clock network experiment?

Symmetry tests of gravity

How do we test if standard gravity violates parity or time-reversal invariance?

What is the impact of the ThO electron EDM constraint (also Hg and neutron EDM limits) on new physics scenarios?

Dark Energy (DE)

What is the range of viable ideas outside of the cosmological constant, and among these, which have the best motivation? What “hand-waving arguments” motivate where to search?

What is the relation of inflation to the CP problem and baryogenesis (does a CP-violating inflaton do anything, or are the Sakharov conditions not satisfied)?

What is the connection between the inflaton and dark energy?

Non-quantum fields?

It was noted that loop corrections complicate the physics of light scalar fields. Can one imagine “non-quantizable” fields (something like gravity that can’t be quantized simply)? For example, torsion or chiral gravity that is not quantized (at least in the usual way)?

Do such fields have distinct signatures compared to quantized spin-0, spin-1 quantum fields? What is plausibility, for example, of long-range torsion gravity?

Transients in astronomical spectroscopy

Could we detect new physics “passing through” the line of sight between Earth and an astrophysical object?

Could we search for transients using a telescope? There was a suggestion to use an astro-comb…

For example, it was suggested that a quintessence field coupled electromagnetically could generate Faraday rotation: if \phi is clumped or forms a topological defect, is this something observable?

Impact of proton radius measurements

What kind of new physics might it imply?

Hidden sector

Tests of the spin-statistics connection have been conducted and more are planned. Could these be sensitive to some of the hidden-sector physics?

What are observable signatures of hidden sector supersymmetry?

Large extra dimensions

What is the present status and is there strong motivation to go to particular length scales in tests? Is 100 microns a special length? What would show up in atoms?

Experimentally, what is the status of patch potential systematics?

Lense-Thirring effect for intrinsic spins

How does the Lense-Thirring effect for intrinsic spin compare with that for orbital angular momentum? Is there a way to test this?

Algorithmic approach to scientific writing style: the structure of a paragraph
http://www.dereviankogroup.com/algorithmic-approach-to-scientific-writing-style-structure-of-a-paragraph/
Wed, 02 Jul 2014

With three of my students writing their theses this year, I decided to formalize some advice on clear writing style. Yes, writing is an art form, yet I find that following these simple rules produces understandable technical writing - a vast improvement over following no rules at all.

Here are my notes on how to make a paragraph flow.

Pick a keyword/phrase/concept/idea that you want to focus on in the paragraph. This keyword should remain the focus of the individual sentences throughout the paragraph. This simple trick helps the paragraph “flow”.

The first sentence of a paragraph should announce what you intend to communicate in the paragraph. For example, “Below we show that …”. Remember you are writing to be easily understood.

The last sentence should summarize the paragraph and possibly pre-announce what will happen in the following paragraph, tying the paragraphs together.

"Square" rule - typically a paragraph should not occupy more than a "square" on a printed page, i.e., the height of the paragraph should not exceed the column width. Shorter paragraphs are fine. If longer, strongly consider breaking the paragraph in two.

Read the paragraph ALOUD - even a non-native speaker would be able to tell if the writing "feels" right.

When would an unanticipated “new physics” event be apparent to an unsuspecting experimentalist?
http://www.dereviankogroup.com/unanticipated-new-physics-event-apparent-unsuspecting-experimentalist/
Mon, 09 Jun 2014

Suppose an experimentalist has a sensitive device whose conventional physics is well under control. Now let’s assume that once in a while the device is perturbed by some unanticipated “new physics” event, such as an interaction with a lump of dark matter. Suppose the device has enough sensitivity to the “new physics”. When would an unanticipated “new physics” event become apparent to an unsuspecting experimentalist?

This is quite different from particle colliders, where experimentalists do hunt for unusual events. Sometimes a specific signal or signature is anticipated (e.g., the Higgs), so let me emphasize that our unsuspecting fellow experimentalist does not specifically look for new physics.

So when would “new physics” be noticed? An obvious answer would be: when the new physics perturbs the expected signal in a significant way.

Even this simple statement requires qualifiers. Suppose new physics provides a uniform background to the signal, and the signal itself cannot be computed exactly from first principles. For example, transition frequencies of many-electron atoms can be computed only to 3-4 significant figures, while experimentalists can determine some of these frequencies to 18 significant figures. Then (unless there are symmetry arguments, e.g., parity or time-reversal violation and the associated external-field reversals implemented in an experiment) there is no way to separate the new-physics background from the conventional one.

This leaves us with time- or space-dependent new-physics signals. That is, new physics could be noticed if it leads to some noise or drift in time- or space-dependent signals. For example, a uniform-in-time drift in atomic frequencies could reveal a variation of fundamental constants.

What about “new-physics” noise or spike-like events, such as the perturbation by “lumps” of dark matter? Suppose the conventional signal is interrupted by new-physics events. We could characterize such events by how long an event lasts (short/long interaction times), how frequent the events are (rare/frequent), and whether the device is sensitive to the event. For simplicity, we assume that the events do not overlap, i.e., the average time between individual events is much longer than the event duration.

An event would be missed if the average time between consecutive events is larger than the typical time of continuous operation of the device (i.e., the events are rare). Unfortunately, a single bona fide event could be discarded by an experimentalist as an outlier. Indeed, one can never guarantee that everything is fully under control, as there may still be occasional perturbations present, such as a student bumping into an optical table or misbehaving power supplies.

Essentially, rare events would be registered as such only if they are anticipated.

This argument brings us to the following conclusion: unanticipated “new physics” events would be noticed only if sizable events are frequent on the time-scale of the experiment.

There is another caveat: suppose the events are so frequent that they look like white or flicker noise in the signal. After all, it is natural to assume a Poissonian distribution of time intervals between consecutive events. Then there is a danger of “integrating out” the events.

Thereby we have to revise our statement: unanticipated “new physics” events would be noticed only if the sizable events are frequent (but not too frequent) on the time-scale of the experiment.
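The criterion can be put into toy numbers (the rates, durations, and run lengths below are illustrative assumptions, not derived from any particular experiment):

```python
# Toy check of the "frequent but not too frequent" criterion.
def visibility(rate_per_day, duration_s, run_days=1.0):
    """Classify whether transient events would be noticed in a run."""
    day = 86400.0
    expected_events = rate_per_day * run_days
    mean_gap_s = day / rate_per_day
    if expected_events < 1:
        return "rare: likely dismissed as an outlier"
    if mean_gap_s < duration_s:
        return "too frequent: events overlap and wash out as noise"
    return "noticeable: many distinct events per run"

print(visibility(rate_per_day=200, duration_s=10))    # noticeable
print(visibility(rate_per_day=0.01, duration_s=10))   # rare
print(visibility(rate_per_day=2e5, duration_s=10))    # too frequent
```

The middle regime (many events per run, yet well separated compared with their duration) is exactly the window described in the practical numbers below: second-long events at hundreds per day.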

In practical terms, for a typical atomic-physics experiment, the events should last longer than a second, and there should be hundreds of them per day of operation. Even then, the experimentalist should be gutsy enough to put his or her credibility and comfort at risk and publicly report the data as unusual. “New physics” looks for the right fellow to notice and appreciate it.

P.S. I would like to thank Dima Budker and Jeff Sherman for discussions on this topic.

Workshop on New Ideas In Low-Energy Tests Of Fundamental Physics
http://www.dereviankogroup.com/workshop-new-ideas-low-energy-tests-fundamental-physics/
Thu, 20 Mar 2014

I would like to announce a workshop at the intersection of atomic physics with particles and fields (Ok, it has some elements of cosmology too). The workshop will be held at the Perimeter Institute in mid-June 2014. Having visited Perimeter for a couple of weeks, I highly recommend its stimulating environment. Here is the announcement:

NEW IDEAS IN LOW-ENERGY TESTS OF FUNDAMENTAL PHYSICS

The purpose of the workshop is to bring together members of the theoretical and experimental communities interested in finding new fundamental-physics applications for the continuing advancement of high-precision tools in AMO physics. The foci of the workshop will include novel approaches to searches for axions, axion-like particles, and other light exotic fields, which can serve as dark-matter candidates; new ideas in the application of networks of time-correlated devices (atomic magnetometers, atomic clocks, etc.); and new ways of testing the properties of gravitational interactions and fundamental constants, as well as developing new gravitational-wave detectors.