Pages

Thursday, September 11, 2014

Experimental Search for Quantum Gravity – What is new?

Last week I was at SISSA in Trieste for the 2014 conference on “Experimental Search for Quantum Gravity”. I missed the first two days because of child care problems (Kindergarten closed during holiday season, the babysitter ill, the husband has to work), but Stefano Liberati did a great job with the summary talk the last day, so here is a community update.

The briefest of brief summaries is that we still have no experimental evidence for quantum gravity, but then you already knew this. During the last decade, the search for experimental evidence for quantum gravity has focused mostly on deviations from Lorentz-invariance and strong quantum gravity in the early universe that might have left imprints on the cosmological observables we measure today. The focus on these two topics is still present, but we now have some more variety which I think is a good development.

There is still lots of talk about gamma ray bursts and the constraints on deformations of Lorentz-invariance that can be derived from this. One has to distinguish these constraints on deformations from constraints on violations of Lorentz-invariance. In the latter case one has a preferred frame, in the former case not.
Violations of Lorentz-invariance are very strongly constrained already. But to derive these constraints one makes use of an effective field theory approach, that is, one assumes that whatever quantum gravity looks like at high energies (close to the Planck scale), at low energies it must be describable by the quantum field theories of the standard model plus some additional, small terms.

Deformations of Lorentz-symmetry are said not to have an effective field theory limit, and thus these constraints cannot be applied. I cautiously say “are said not to have” such a limit because I have never heard a good argument why it shouldn’t exist. For all I can tell it doesn’t exist just because nobody working on this wants it to exist. In any case, without this limit one cannot use the constraints on the additional interaction terms and has to look for other ways to test the model.

This is typically done by constraining the dispersion relation for free particles, which acquires small correction terms. These corrections to the dispersion relation affect the speed of massless particles, which now becomes energy-dependent. The effects of the deformation become larger with long travel times and large energies, which is why highly energetic gamma ray bursts are so interesting. The deformation would make itself noticeable by either speeding up or slowing down the highly energetic photons, depending on the sign of a parameter.
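To get a feeling for the size of this effect, here is a minimal order-of-magnitude sketch in Python. It assumes a modification linear in energy over the Planck energy with a coefficient of order one, and it ignores cosmological redshift; the photon energy and source distance are illustrative values, not taken from any particular burst.

```python
# Rough order-of-magnitude estimate of the arrival-time delay from a
# Planck-scale, linear-in-energy modification of the photon dispersion
# relation: v(E) ~ c * (1 - xi * E / E_Planck).
# Illustrative numbers only; cosmological redshift is ignored.

E_PLANCK_GEV = 1.22e19      # Planck energy in GeV
C = 3.0e8                   # speed of light, m/s
GPC_IN_M = 3.086e25         # one gigaparsec in meters

def time_delay(energy_gev, distance_gpc, xi=1.0):
    """Delay (in seconds) of a photon of the given energy relative to a
    low-energy photon, for a linear dispersion modification."""
    travel_time = distance_gpc * GPC_IN_M / C   # light travel time in s
    return xi * (energy_gev / E_PLANCK_GEV) * travel_time

# A 10 GeV photon from a burst roughly 1 Gpc away:
delay = time_delay(10.0, 1.0)
print(f"delay ~ {delay:.3f} s")
```

The delay comes out at a few hundredths of a second, which shows why one needs both very high photon energies and cosmological distances: the tiny ratio E/E_Planck is only compensated by the enormous travel time.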

Current constraints put the limits roughly at the Planck scale if the modification is either to slow down or to speed up the photons. Putting constraints on the case where the deformation is stochastic (sometimes speeding up, sometimes slowing down) is more difficult and so far there haven’t been any good constraints on this. Jonathan Granot briefly flashed some constraints on the stochastic case, but said he can’t spill the details yet, some collaboration issue. He and collaborators do however have a paper coming out within the next few months that I expect will push the stochastic case up to the Planck scale as well.

On the other hand we heard a talk by Giacomo Rosati who argues that to derive these bounds one uses the normal expansion of the Friedmann-Robertson-Walker metric, but that the propagation of particles in this background should be affected by the deformed theory as well, which weakens the constraints somewhat. Well, I can see the rationale behind the argument, but after 15 years the space-time picture that belongs to deformed Lorentz-invariance is still unclear, so this might or might not be the case. There were some other theory talks that try to get this space-time picture sorted out but they didn’t make a connection to phenomenology.

Jakub Mielczarek was at the meeting talking about the moment of silence in the early universe and how to connect this to phenomenology. In this model for the early universe space-time makes a phase-transition from a Euclidean regime to the present Lorentzian regime, and in principle one should be able to calculate the spectral index from this model, as well as other cosmological signatures. Alas, it’s not a simple calculation and progress is slow since there aren’t many people working on it.

Another possible observable from this phase-transition may be leftover defects in the space-time structure. Needless to say, I like that very much because I was talking about my model for space-time defects, which is basically a parameterization of this possibility in general (slides here). It would be great if one could connect these parameters to some model about the underlying space-time structure.

The main message of my talk is that if you want to preserve Lorentz-invariance, as my model does, then you shouldn’t look at high energies because that’s not a Lorentz-invariant statement to begin with. You should look instead at wave-functions sweeping over large world-volumes. This typically means low energies and large distances, which is not a regime that presently gets a lot of attention when it comes to quantum gravity phenomenology. I certainly hope this will change within the next few years because it seems promising to me. Well, more promising than the gamma ray bursts anyway.

We also heard Joao Magueijo in his no-bullshit style explaining that modified dispersion relations in the early universe can reproduce most achievements of inflation, notably the spectral index including the tilt and solving the horizon problem. This becomes possible because an energy-dependence in the speed of light together with redshift during expansion turns the energy-dependence into a time-dependence. If you haven’t read his book “Faster Than the Speed of Light”, I assure you you won’t regret it.

The idea of dimensional reduction is still popular but experimental consequences, if any, come through derived concepts such as a modified dispersion relation or early universe dynamics, again.

There was of course some discussion of the BICEP claim that they’ve found evidence for relic gravitational waves. Everybody who cared to express an opinion seemed to agree with me that this isn’t the purported evidence for quantum gravity that the press made out of it, even if the measurement was uncontroversial and statistically significant.

As we discussed in this earlier post, to begin with this doesn’t test quantum gravity at high energies but only the perturbative quantization of gravity, which for most of my colleagues isn’t really quantum gravity. It’s the high energy limit that we do not know how to deal with. And even to claim that it is evidence for perturbative quantization requires several additional assumptions that may just not be fulfilled, for example that there are no non-standard matter couplings and that space-time and the metric on it exist to begin with. This may just not be the case in a scenario with a phase-transition or with emergent gravity. I hope that next time the media picks up the topic they care to talk to somebody who actually works on quantum gravity phenomenology.

Then there was a member from the Planck collaboration whose name I forgot, who tried to say something about their analysis of the foreground effects from the galactic dust that BICEP might not have accurately accounted for. Unfortunately, their paper isn’t finished and he wasn’t really allowed to say all that much. So all I can tell you is that Planck is pretty much done with their analysis and the results are with the BICEP collaboration, which I suppose is presently redoing their data fitting. Planck should have a paper out by the end of the month, we’ve been told. I am guessing it will primarily say there’s lots of uncertainty and we can’t really tell whether the signal is there or isn’t, but look out for the paper.

There was also at the conference some discussion about the possibility to test quantum gravitational effects in massive quantum systems, as suggested for example by Igor Pikovski et al. This is a topic we previously discussed here, and I still think it is extremely implausible. The Pikovski et al paper is neither the first nor the last to have proposed this type of test, but it is arguably the one that got the most attention because they managed to get it published in Nature Physics. These experiments are supposed to test basically the same deformation that the gamma ray bursts also test, just on the level of commutation relations in quantum mechanics rather than in the dispersion relation (the former leads to the latter, the opposite is not necessarily so).

The problem is that in this type of theory nobody really knows how to get from the one-particle case to the many-particle case, which is known as the ‘soccer-ball-problem’. If one naively just adds the energies of particles, one finds that the corrections blow up when one approaches the Planck mass, which is about 10^-5 grams. That doesn’t make a lot of sense - to begin with because we wouldn’t reproduce classical mechanics, but also because quantum gravitational effects shouldn’t scale with the energy but with the energy density. This means that the effects should get smaller for systems composed of many particles. In this case then, you cannot get good constraints on quantum gravitational effects in the proposed experiments. That doesn’t mean one shouldn’t do the experiment. This is new parameter space in quantum mechanics and one never knows what interesting things one might find there. I’m just saying don’t expect any quantum gravity there.
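The scaling argument above can be made concrete with a few lines of arithmetic. This sketch compares the naive scaling, where the correction grows with the total mass of the system, to a commonly discussed suppressed variant, where it scales with the energy per constituent; the soccer-ball mass and nucleon count are rough illustrative figures.

```python
# Illustrates the 'soccer-ball problem': if Planck-scale corrections
# scale with the *total* energy of a composite system, they become
# much larger than one already for everyday masses, in conflict with
# the classical mechanics we observe. If they instead scale with the
# energy per constituent, they stay utterly negligible.

PLANCK_MASS_G = 2.18e-5   # Planck mass in grams (~1e-5 g)

def naive_correction(total_mass_g):
    """Relative correction if effects scale with the total mass/energy."""
    return total_mass_g / PLANCK_MASS_G

def suppressed_correction(total_mass_g, n_particles):
    """Relative correction if effects scale with energy per constituent."""
    return (total_mass_g / n_particles) / PLANCK_MASS_G

ball_mass = 430.0   # mass of a soccer ball in grams, roughly
n_nucleons = 2.6e26 # rough number of nucleons in 430 g

print(naive_correction(ball_mass))                  # ~2e7: absurdly large
print(suppressed_correction(ball_mass, n_nucleons)) # ~8e-20: negligible
```

The naive correction exceeds one by seven orders of magnitude for a soccer ball, which is exactly why the naive summation cannot be right, and the per-constituent version is far too small for the proposed experiments to see.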

Also at the conference was Jonathan Miller, who I had been in contact with earlier about his paper in which he and his coauthor estimate whether the effect of gravitational bremsstrahlung on neutrino propagation is detectable (we discussed this here). It is an interesting proposal that I spent quite some time thinking about because they don’t make giant leaps of faith about the scaling of quantum gravitational effects. In this paper, it is plainly perturbatively quantized gravity.

However, after some thinking about this I came to the conclusion that while the cross-section that they estimate may be at the right order of magnitude for some cases (I am not too optimistic about the exact case that they discuss in the paper), the total probability for this to happen is still tiny. That is because unlike the case of cross-sections measured at the LHC, for neutrinos scattering off a black hole one doesn’t have a high luminosity to bring up the chance of ever observing this. When I estimated the flux, the probability turned out to be too small to be observable by at least 30 orders of magnitude, i.e. what you typically expect for quantum gravity. Anyways, I had some interesting exchange with Jonathan who, needless to say, isn’t entirely convinced by my argument. So it’s not a settled story, and I’ll let you know what comes out of this.

Finally, I should mention that Carlo Rovelli and Francesca Vidotto talked about their Planck stars and the possible phenomenology that these could lead to. We previously discussed their idea here. They are arguing basically that quantum gravitational effects can be such that a black hole (with an apparent horizon, not an event horizon) does not slowly evaporate until it reaches the Planck mass, but suddenly explodes at a mass still much higher than the Planck mass, thereby releasing its information. If that was possible, it would sneak around all the issues with firewalls and remnants and so on. It might also have observable consequences, for these explosions might be detectable. However, this idea is still very much in its infancy, and several people in the audience raised concerns similar to mine, namely whether this can work without violating locality and/or causality in the semi-classical limit. In any case, I am sure that we will hear more about this in the near future.

All together I am relieved that the obsession with gamma ray bursts seems to be fading, though much of this fading is probably due to both Giovanni Amelino-Camelia and Lee Smolin not being present at this meeting ;)

This was the first time I visited SISSA since they moved to their new building, which is no longer located directly at the coast. It is however very nicely situated on a steep hill, surrounded by hiking paths through the forest. The new SISSA building used to be a hospital, like the buildings that house Nordita in Stockholm. I’ve been told my office at Nordita is in what used to be the tuberculosis sector, and if I’m stuck with a computation I can’t help but wonder how many people died at the exact spot my desk stands now. As to SISSA, I hope that the conference was on what was formerly the pregnancy ward, and that the meeting, in spirit, may give birth to novel ideas for how to test quantum gravity.

16 comments:

Hi Bee: Nice summary. But one can over-criticize the news media for what is not their fault. "I hope that next time the media picks up the topic they care to talk to somebody who actually works on quantum gravity phenomenology." If my memory serves right, when the BICEP2 results came out, prominent physicists like Guth, Linde, Wilczek, Krauss and some others claimed that they confirm quantum gravity.

kashyap: That's exactly what I am criticizing. Journalists should at least try to bring some objectivity to the subject and that means they should ask experts in the field they report on. None of the people you name works on or has ever worked on quantum gravity phenomenology. They're all famous for something of course, but that doesn't mean they shit gold. I mean, why not ask Justin Bieber what he thinks of the BICEP results? Seriously, don't you think that certainly would bring up the page visits?

Yes, nice summary and flowers all around. I liked your slide presentation too and some of the ideas deep behind it. That Lorentz may be preserved for some reason could tend to an absolute value at the extreme scales of energy. There are always conflicts along the borders of new frontiers. Best

There is a quantum gravity effect which is measurable: the value of the cosmological constant (CC). In arxiv:1407.1394 M. Vojinovic and I showed that the effective CC is a sum of a classical CC plus quantum gravity CC plus matter CC in a quantum gravity theory proposed in arxiv:1402.4672. The classical CC can cancel the matter CC, which leaves the quantum gravity CC, and this value is naturally small in 1/L_P^2 units.

If quantum gravity is not taken into consideration, one obtains the CC problem of extreme fine tuning. Namely, the effective CC is then a sum of the classical and the matter contribution, and one has to arrange CC(0) + CC(m) = 10^{-122} in 1/l_P^2 units, while CC(m) is by QFT given as a sum of terms of O(1) in 1/l_P^2 units. By taking into account the quantum gravity contribution, this problem is resolved by choosing CC(0) = -CC(m).

The quantum gravity theory in question is Regge quantum gravity with a special measure and the assumption that the spacetime is not a smooth manifold but a triangulation (PWL manifold) with a large number of 4-simplices. In such a theory there will be spacetime defects associated with the edges and the corners.

Bee: I see your point. But an average journalist, who does not know much science, would not have time to look for some physicist somewhere doing quantum gravity phenomenology. If I were a journalist I would ask prominent people who make headlines in the newspapers. In my list, there is one Nobel prize winner, two potential Nobel winners, and one who goes all over the world lecturing about his book on cosmology! Perhaps the answer is that prominent people should be more cautious in their statements.

A. Mikovic: I like the abstract structures and arithmetic of your comment. It seems to me however, that a simplex (simplexes) are not the deepest foundations. So a lattice of them as a more general space has to at least involve in 4D (albeit from a Euclidean stance) the self-dual 24-cell polytope. So concerning these dimensionless constants (CC or fine structure related) we multiply or divide by 4. We then get the double factorial group, 384, of the hypercube. But 2D point and edge representation is an astute start. The sum of orthogonal including maximum symmetry centered on the Monster group is equal to a product of them. But the question still remains how and if in these constraints remote or local structures in high or low energy or dimensions have limits as clear physics.

The quantum gravity effects are easy to detect (Podkletnov & Poher experiments, EM and Woodward drives verified with Chinese and NASA) and they represent the whole physics of Tesla scalar wave (i.e. antigravity beams, interactions of ferromagnetic monopoles and Dirac electrons with vacuum), negentropic phenomena (magnetic motors) and many other findings, which the physicists still manage to ignore completely. Various psychic phenomena (telekinesis, telepathy) belong here too.

Regarding the delay of gamma ray bursts, the quantum gravitists are fooled again: the faster low frequency photons are revolving these slower & heavier ones, so that they arrive at the same moment, even though they propagate with different speed.

"Many physicists accept quantum mechanics as a final theory and attempt to solve the open problems in physics and cosmology, such as unification and quantum gravity, within the existing framework of quantum physics. I worry that this is wrong and cannot succeed, because quantum mechanics itself must be radically deepened and completed to make further progress in our understanding of nature."

In particular, you cannot base unification on theories whose predictions differ by many orders of magnitude. It's evident that such theories represent only mutually insulated patches/perspectives of a much wider framework.

So, official science doesn’t know what usual, observable, classical gravity really, physically is, how it can be inserted in the unified picture together with other interactions, particles, their properties, etc.

Next, it has no consistent idea about what “quantum gravity” can be within that classical gravity picture, nor even if real gravity quantization exists at all, in any known sense. Assumptions and “models” vary arbitrarily between minus and plus infinity and are all hugely inconsistent.

And on the background of that absolute theoretical mist filled with heavy contradictions they open a vast program of experimental search for “I don’t know what”. The situation appears unprecedented and desperate. For in that situation any experimental result, positive or negative, will be doubtful, ambiguous and meaningless, par excellence, and will be unable to close the existing huge theoretical gaps in any case. This fundamental ambiguity is greatly amplified by the extremely fine structure and small magnitude of expected effects, which produces various false eurekas and huge failures of recent times (classical gravity wave detection among them) already in much simpler and theoretically (externally) transparent situations. Including, of course, “the great discovery of the Higgs boson” that “provides the origin of mass”, but does not involve gravity (typical “completeness” of THAT, very special kind of science).

It is evident, that this is totally the problem of the theory, while experiments today are surprisingly fine and technically perfect. But no technical perfection can compensate for theoretical blindness and absence of elementary consistency. This is largely true for all modern experiments in fundamental physics (recall all growing “dark matters”, “hidden” dimensions and parallel universes), with extraordinary culmination in quantum gravity.

Conclusion: instead of vain, expensive searches without any reasonable perspective, one should concentrate on consistent, causally complete theory construction, and only then try further experimental search, now of provable sense and high efficiency. By the way, such theory will be largely confirmed already due to its completeness, i.e. unified and consistent explanation of ALL already known facts, which can hardly be a coincidence (contrary to separated “models” of usual theory).

Contrary to official statements, such causally complete theory is possible and exists (maybe even within a number of initially different approaches), rooted in the vision and tentative results of founding fathers of modern physics (today often neglected). The problem is that it must necessarily introduce a big change with respect to “officially recognised” (though definitely failing) approaches and models, and this is something “absolutely impossible” in modern science organisation, for purely subjective reasons.

Because otherwise they would organise only one kind of “event”, the brainstorming interactive work with the well-defined purpose of establishing the mentioned causally complete, intrinsically unified theory (in real time), instead of all those senseless and unique-resource-wasting amusements with ultimately ambiguous quantum gravity or even more grotesque quantum mysteries for science writers… Sorry for a constructive approach open to concrete possibilities of practical realisation.

Sabine, the comment for consideration of worlds within worlds... it seems that can be a part of the question of minimum and maximum lengths. Early on, in 1963, I have a photo where my friend had me mess my hair like Einstein and I drew my 4D chessgame on a blackboard. I will try to find it. If both concepts are part of the picture this has bearing on our range of questions. Best