I am a quantum space-time phenomenologist. I study the question of whether the space-time we live in is a definite reality that serves as a background on which all matter and light exist, or whether it is an emergent phenomenon arising from a fundamentally indeterminate quantum system. Is the fabric of space-time absolute, or does it arise only as a specific observation made by a specific observer, with its true nature probabilistic and woven out of relationships among the building blocks of reality? I study this question from an empirical perspective, constructing models that attempt to describe actual experiments and observations, designing research programs to test those models, interpreting data to understand what it tells us about the underlying laws and principles, and sometimes even taking a role in the commissioning and operations of an experiment. Currently, I am carrying out the second-generation Holometer research program at Fermi National Accelerator Laboratory (Fermilab), which measures tiny fluctuations of space-time at unprecedented precision (1/100,000,000 the diameter of a single atom!) to see if its background fabric is very subtly twisting due to quantum indeterminacies.

My research aims to address a number of fundamental mysteries. One is the holographic universe: Traditionally, it was widely thought that the familiar smooth space-time we experience breaks down only at the Planck scale, 10⁻⁴⁴ seconds. But we have since learned that black holes, the densest objects possible in general relativity, have information contents that scale with their 2-dimensional surface area rather than their volume. If we had a 3+1 dimensional background of space-time that is smooth all the way down to Planck scale resolution, even "empty" vacuum would hold a huge information content — a sphere of vacuum merely the radius of the Big Island of Hawaii would be denser in information than a black hole! So the entire universe must somehow be a hologram with just 2 dimensions' worth of information, implying that the uncertainties in its background fabric are much larger than the Planck scale. Another significant topic is "dark energy": The universe is expanding at an accelerating rate, with an unknown energy driving it. A common hypothesis is that this is the energy of empty space-time, but an estimate using standard physics yields a value that is too large by a factor of 10¹²² — the worst failed calculation in physics. This anomalously "fine-tuned" small value is critical for the universe to develop its observed structure! We think that a reformulation using our models of quantum space-time may address the incorrect scaling. Lastly, I am interested in foundational problems in quantum mechanics: There have been many attempts to rigorously test whether quantum indeterminacies are fundamental features of reality, or merely reflections of our ignorance of the underlying total information available in the universe, but none of the mathematical models used to describe these experiments have included a quantum space-time as part of the physical system. While we are nowhere close to closing this loophole, an empirical verification of my models at the Holometer would take us a step closer.

In our understanding of the laws of the universe, we have two great pillars: Quantum Mechanics (QM) and General Relativity (GR). Collectively, they explain all phenomena ever observed — QM explains the physics of matter and light, and GR explains the nature of space-time and gravity — to stunning levels of precision. But the two theories are in foundational conflict. QM is built out of discrete and fundamentally indeterminate probabilistic states that are not localized in space and time; at its core lies a rejection of local realism. GR is constructed from a continuum of definite points in space-time, based on a principle of invariance: that there exists a consistent reality independent of any arbitrary choice of coordinates. How to reconcile the two?

For decades, the community has relied heavily on local quantum field theory, which compromises both sets of general principles to incorporate quantum mechanics and special relativity into one framework. Our progress in this direction has been limited by the lack of connection to any empirical data. Perhaps, in seeking new data to guide us, it is time to address the fundamental limitations of our presupposed frameworks, and attempt to directly connect foundational principles to phenomenological signatures.

Tracing back to the issues Einstein grappled with: in QM, the content of the theory has physical meaning only in the context of a measurement, manifested as correlations between observables. The observer and the events occurring are typically placed on an absolute background of space-time, but GR tells us that our physics must be independent of any such entity, as exemplified by the gravitational field being the background space-time itself. Space-time, too, must be part of the quantum system that contains events and observers, with its fabric emerging out of relationships between subsystems, beyond mere perturbations on a background metric.

Since 2011, I have been working to better understand what such a phenomenology would look like, build heuristic models of it, and develop experimental design concepts and statistical frameworks for data interpretation. Since 2016, I have also been working to test my models by commissioning, operating, and analyzing data from the second-generation Holometer experiment at Fermilab.

In standard physics, this interplay between QM and GR is expected only at the Planck scale, far out of experimental reach at 10⁻⁴⁴ seconds, requiring the extreme energies of black holes or the early universe for any observational evidence. However, there is reason to believe this is an artifact of the limitations of our mathematical frameworks. One of the most profound discoveries of recent decades is the fact that the entropy of a black hole — a measure of its information content — scales with its 2-dimensional surface area, not its volume. We know that every bit of information, every independent degree of freedom, is connected to a finite amount of energy. A system in standard physics built upon a 3+1 dimensional background space-time simply contains vastly more degrees of freedom than a 2-D system, and even an "empty" 3-D sphere of vacuum barely large enough to cover the Big Island of Hawaii would hold more energy than the gravitational bound for forming a black hole of that size. This led to the "holographic principle" — the hypothesis that the true information content of the universe is 2-D.
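The mismatch in degrees of freedom can be made concrete with an order-of-magnitude count. Here is a minimal sketch in Python, assuming a Planck length of about 1.6×10⁻³⁵ m and taking roughly 60 km as the radius of the Big Island — both round illustrative numbers of my own choosing, not values stated in the text, and the holographic count is only good up to factors of order unity:

```python
import math

PLANCK_LENGTH = 1.616e-35  # meters (CODATA value, rounded)
R = 6.0e4                  # meters: rough Big Island radius (assumed)

# Degrees of freedom if space-time were smooth down to Planck resolution:
# roughly one per Planck-sized cell in the 3-D volume.
n_volume = (R / PLANCK_LENGTH) ** 3

# Holographic bound: roughly one bit per Planck area on the 2-D boundary
# (Bekenstein-Hawking entropy, A / 4 l_P^2, ignoring order-unity factors).
n_holographic = (4 * math.pi * R**2) / (4 * PLANCK_LENGTH**2)

print(f"3-D volume count:  {n_volume:.1e}")
print(f"holographic bound: {n_holographic:.1e}")
print(f"excess factor:     {n_volume / n_holographic:.1e}")
```

The 3-D count overshoots the 2-D bound by dozens of orders of magnitude, which is the heart of the argument that a smooth Planck-resolution background holds far too much information.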

In a holographic universe, the information density decreases inversely with system size: total information grows only with the 2-D boundary area, while volume grows faster. This means that at larger scales, there are fewer independent degrees of freedom, and more quantum correlations. Quantum mechanics provides us with a mechanism by which this may be possible: entanglement — Einstein's "spooky" action at a distance. In entanglement, multiple physical entities may share a common degree of freedom as a single system extended across arbitrarily large distances, so their observables can be correlated. This still preserves the structure of causality in space and time, because the information accessible to each observer in each measurement is restricted in a way that prohibits sending signals faster than light. I have been using the same principles to estimate possible correlation signatures in a space-time that is itself constructed out of quantum entanglement. I assume no background, and model space-time itself as many quantum observables, with its fabric emerging from entanglement relationships that follow causal structures governing what information each observer can access in a given proper time interval.
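The inverse scaling can be checked directly: if information grows with area (∝ R²) while enclosed volume grows as R³, their ratio falls as 1/R, so doubling the system size halves the information density. A quick scale-free illustration (the helper function is hypothetical, with the Planck length and one-bit-per-Planck-area counting assumed as above):

```python
import math

PLANCK_LENGTH = 1.616e-35  # meters

def holographic_info_density(radius):
    """Holographic information content (~ area / 4 l_P^2, order of
    magnitude) divided by the enclosed 3-D volume: scales as 1/radius."""
    info = math.pi * radius**2 / PLANCK_LENGTH**2
    volume = (4.0 / 3.0) * math.pi * radius**3
    return info / volume

r = 1.0  # meters; any scale works, since the ratio below is scale-free
ratio = holographic_info_density(2 * r) / holographic_info_density(r)
print(ratio)  # 0.5: doubling the size halves the information density
```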

It is naturally quite difficult to create a model that is consistent with all empirically verified principles, such as Lorentz invariance. We want to recover locality in a consistent way, with "spooky" signatures of entanglement in space-time not locally observable, emerging only when two measurements are correlated across a spacelike separation. We believe we finally have such a model. The predicted quantum indeterminacy in space-time is much larger than the Planck scale, but still amounts to less than 10⁻⁶ meters accumulated across the distance from Earth to the Andromeda Galaxy. Amazingly, thanks to the technology developed by LIGO for the detection of gravitational waves, this effect might be observable in an apparatus only 40 meters in size! During the years it took us to build this model, my collaborators commissioned and completed a first-generation Holometer project based on an early hypothesis (without a mature model), which obtained a remarkably clean null result and demonstrated a sensitivity an order of magnitude better than what is needed. Since then, I have devoted my time to the experimental effort, reconfiguring the apparatus to test my new models (against standard physics and against the earlier null result) and collecting and analyzing the data.
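A commonly quoted heuristic for this kind of holographic indeterminacy is an uncertainty that accumulates like a random walk of Planck-length steps, δx ~ √(l_P L). Taking that scaling at face value — an assumption on my part; the actual model is more refined — reproduces the orders of magnitude in the paragraph above:

```python
import math

PLANCK_LENGTH = 1.616e-35          # meters
LIGHT_YEAR = 9.461e15              # meters
L_ANDROMEDA = 2.5e6 * LIGHT_YEAR   # ~2.5 million light-years to Andromeda

def holographic_uncertainty(distance):
    """Random-walk estimate: Planck-length steps accumulated over a
    distance L give a net uncertainty sqrt(l_P * L)."""
    return math.sqrt(PLANCK_LENGTH * distance)

dx_andromeda = holographic_uncertainty(L_ANDROMEDA)
dx_lab = holographic_uncertainty(40.0)  # a 40-meter apparatus

print(f"Earth-Andromeda: {dx_andromeda:.1e} m")  # below 10^-6 m
print(f"40-m apparatus:  {dx_lab:.1e} m")        # tens of attometers
```

The 40-meter estimate lands at the attometer scale, far smaller than an atom — which is why LIGO-grade interferometry is what makes a laboratory test feasible at all.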

Our upcoming results will potentially shed light on a number of important questions. One is the accelerating expansion of the universe. The "dark energy" that drives this acceleration is a factor of 10¹²² smaller than what we would expect from standard physics, and this anomalously "fine-tuned" value seems necessary for a structured universe to exist. We may be able to explain this value if the vacuum of space-time embodies correlations similar to those in our models. The results are also relevant to the foundations of quantum mechanics. Rigorous tests have been conducted of whether the indeterminacies and nonlocalities of quantum mechanics are fundamental and ontic, or merely epistemic uncertainties due to our ignorance, but the mathematical frameworks used to make "loophole-free" arguments do not include the background space-time in their quantum systems. I seek to address that.
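The famous factor can be reproduced, up to convention-dependent numerical factors, by comparing the Planck energy density — the naive vacuum energy of standard physics with a Planck-scale cutoff — against the observed dark energy density. A rough check using rounded CODATA and Planck-satellite numbers (my own choices; the exact exponent shifts by a few depending on factors of 2π and the cutoff convention):

```python
import math

# Physical constants (SI, rounded)
c = 2.998e8       # speed of light, m/s
hbar = 1.055e-34  # reduced Planck constant, J*s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2

# Naive vacuum energy density with a Planck-scale cutoff: c^7 / (hbar G^2)
rho_planck = c**7 / (hbar * G**2)  # J/m^3

# Observed dark energy density: ~69% of the critical density,
# with a Hubble constant of ~67.7 km/s/Mpc
H0 = 67.7e3 / 3.086e22                               # 1/s
rho_critical = 3 * H0**2 * c**2 / (8 * math.pi * G)  # J/m^3
rho_dark = 0.69 * rho_critical

discrepancy = rho_planck / rho_dark
print(f"discrepancy ~ 10^{math.log10(discrepancy):.0f}")
```

This lands within a few powers of ten of the quoted 10¹²², which is all the order-of-magnitude argument requires.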

The nonlocal quantum correlations in our phenomenology have also been applied to an inflationary space-time, and are predicted to be observable in large-angle CMB anisotropies and in large-scale structure surveys such as the Dark Energy Survey (DES). I will be joining efforts in this direction soon.

Stay tuned. This is an exciting time for empirical science. We expect to release official public results by the second half of 2019.