Pages

Wednesday, February 13, 2019

When gravity breaks down

Einstein’s theory of general relativity is more than a hundred years old, but it still gives physicists headaches. Not only are Einstein’s equations hideously difficult to solve, they also clash with physicists’ other most-cherished achievement, quantum theory.

Problem is, particles have quantum properties. They can, for example, be in two places at once. These particles also have masses, and masses cause gravity. But since gravity does not have quantum properties, no one really knows what the gravitational pull of a particle in a quantum superposition is. To solve this problem, physicists need a theory of quantum gravity. Or, since Einstein taught us that gravity is really curvature of space-time, physicists need a theory for the quantum properties of space and time.

It’s a hard problem, even for big-brained people like theoretical physicists. They have known since the 1930s that quantum gravity is necessary to bring order into the laws of nature, but 80 years on a solution isn’t anywhere in sight. The major obstacle on the way to progress is the lack of experimental guidance. The effects of quantum gravity are extremely weak and have never been measured, so physicists have only math to rely on. And it’s easy to get lost in math.

The reason it is difficult to obtain observational evidence for quantum gravity is that all presently possible experiments fall into two categories. Either we measure quantum effects – using small and light objects – or we measure gravitational effects – using large and heavy objects. In both cases, quantum gravitational effects are tiny. To see the effects of quantum gravity, you would really need a heavy object that has pronounced quantum properties, and that’s hard to come by.
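To see how lopsided the two regimes are, here is a rough back-of-the-envelope sketch in Python. Comparing an object's Compton wavelength (its quantum length scale) with its Schwarzschild radius (its gravitational length scale) is a standard heuristic, not something taken from this post; the constants are CODATA values.

```python
# Quantum effects matter when an object's Compton wavelength (hbar/mc)
# is large; gravitational effects matter when its Schwarzschild radius
# (2Gm/c^2) is large. The two scales only meet near the Planck mass,
# which is why no ordinary object shows both at once.
hbar = 1.054571817e-34  # J s
c    = 2.99792458e8     # m/s
G    = 6.67430e-11      # m^3 kg^-1 s^-2

def compton_wavelength(m):      # quantum length scale of mass m, in meters
    return hbar / (m * c)

def schwarzschild_radius(m):    # gravitational length scale of mass m, in meters
    return 2 * G * m / c**2

m_electron = 9.109e-31   # kg: quantum scale huge, gravity scale absurdly small
m_kilogram = 1.0         # kg: the reverse
for m in (m_electron, m_kilogram):
    print(m, compton_wavelength(m), schwarzschild_radius(m))

# The crossover mass (where the two scales are equal) is of Planck-mass order:
m_cross = (hbar * c / (2 * G)) ** 0.5
print(m_cross)   # ~1.5e-8 kg, i.e. tens of micrograms
```

An electron's Compton wavelength is about 4e-13 m while its Schwarzschild radius is about 1e-57 m; for a kilogram the mismatch runs the other way.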

Physicists do know a few naturally occurring situations where quantum gravity should be relevant. But, contrary to what I often hear, the criterion is not short distances. Non-quantized gravity really fails in situations where energy densities become large and space-time curvature becomes strong. And let me be clear that what astrophysicists consider “strong” curvature is still “weak” curvature for those working on quantum gravity. In particular, the curvature at a black hole horizon is not remotely strong enough to give rise to noticeable quantum gravitational effects.
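To illustrate just how weak the horizon curvature is, one can compare the curvature radius at the horizon of a solar-mass black hole (which is of order the Schwarzschild radius) with the Planck length. The order-of-magnitude criterion below is a common rule of thumb, not a statement from the post:

```python
# Order-of-magnitude check: curvature at a black hole horizon.
# At the horizon the curvature radius is of order the Schwarzschild
# radius r_s = 2GM/c^2; quantum gravity requires curvature radii
# near the Planck length l_p = sqrt(hbar*G/c^3).
hbar  = 1.054571817e-34  # J s
c     = 2.99792458e8     # m/s
G     = 6.67430e-11      # m^3 kg^-1 s^-2
M_sun = 1.989e30         # kg

r_s = 2 * G * M_sun / c**2        # ~3 km for a solar-mass black hole
l_p = (hbar * G / c**3) ** 0.5    # ~1.6e-35 m

# Dimensionless curvature in Planck units ~ (l_p / r_s)^2:
curvature_in_planck_units = (l_p / r_s) ** 2
print(r_s, l_p, curvature_in_planck_units)   # ~3e-77: utterly negligible
```

The number comes out some 77 orders of magnitude below the Planckian regime, which is the quantitative content of "not remotely strong enough."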

Curvature strong enough to cause general relativity to break down exists, we believe, only in the centers of black holes and close to the Big Bang. In both cases the strongly compressed matter has a high density and pronounced quantum behavior, which should give rise to quantum gravitational effects. Unfortunately, we cannot look inside a black hole, and reconstructing what happened at the Big Bang from today’s observations cannot, with present measurement techniques, reveal the quantum gravitational behavior.

The regime where quantum gravity becomes relevant should also be reached in particle collisions at extremely high center-of-mass energy. If you had a collider large enough – estimates say that with current technology it would be about the size of the Milky Way – you could focus enough energy into a small region of space to create strong enough curvature. But we are not going to build such a collider any time soon.

Besides strong space-time curvature, there is another case where quantum effects of gravity should become measurable that is often neglected: by creating quantum superpositions of heavy objects. This causes the approximation in which matter has quantum properties but gravity doesn’t (the “semi-classical limit”) to break down, revealing truly quantum effects of gravity. A few experimental groups are currently trying to reach the regime where they might become sensitive to such effects. They still have some orders of magnitude to go, so they are not quite there yet.

Why don’t physicists study this case closer? As always, it’s hard to say why scientists do one thing and not another. I can only guess it’s because from a theoretical perspective this case is not all that interesting.

I know I said that physicists don’t have a theory of quantum gravity, but that is only partly correct. Gravity can be, and has been, quantized using the normal methods of quantization, already in the 1960s by Feynman and DeWitt. However, the theory one obtains this way (“perturbative quantum gravity”) breaks down in exactly the strong-curvature regime in which physicists want to use it (it is “perturbatively non-renormalizable”). Therefore, this approach is today considered merely a low-energy approximation (“effective theory”) to the yet-to-be-found full theory of quantum gravity (its “UV-completion”).

Past the 1960s, almost all research efforts in quantum gravity focused on developing that full theory. The best known approaches are string theory, loop quantum gravity, asymptotic safety, and causal dynamical triangulation. The above-mentioned case with heavy objects in quantum superpositions, however, does not induce strong curvature and hence falls into the realm of the boring and supposedly well-understood theory from the 1960s. Ironically, for this reason there are almost no theoretical predictions for such an experiment from any of the major approaches to the full theory of quantum gravity.

Most people in the field presently think that perturbative quantum gravity must be the correct low-energy limit of any theory of quantum gravity. A minority, however, holds that this isn’t so, and members of this club usually quote one or both of the following reasons.

The first objection is philosophical. It does not make much conceptual sense to derive a supposedly more fundamental theory (quantum gravity) from a less fundamental one (non-quantum gravity), because by definition the derived theory is the less fundamental one. Indeed, the quantization procedure for Yang-Mills theories is a logical nightmare. You start with a non-quantum theory, make it more complicated to obtain another theory (though that is not, strictly speaking, a derivation), and if you then take the classical limit you get a theory that doesn’t have any good interpretation whatsoever. So why start from it in the first place?

Well, the obvious answer is: We do it because it works, and we do it this way because of historical accident not because it makes a lot of sense. Nothing wrong with that for a pragmatist like me, but also not a compelling reason to insist that the same method should apply to gravity.

The second often-named argument against the perturbative quantization is that you do not get atomic physics by quantizing water either. So if you think that gravity is not a fundamental interaction but comes from the collective behavior of a large number of microscopic constituents (think “atoms of space-time”), then quantizing general relativity is simply the wrong thing to do.

Those who take this point of view that gravity is really a bulk-theory for some unknown microscopic constituents follow an approach called “emergent gravity”. It is supported by the (independent) observations of Jacobson, Padmanabhan, and Verlinde, that the laws of gravity can be rewritten so that they appear like thermodynamical laws. My opinion about this flip-flops between “most amazing insight ever” and “curious aside of little relevance,” sometimes several times a day.

Be that as it may, if you think that emergent gravity is the right approach to quantum gravity, then the question where gravity-as-we-know-and-like-it breaks down becomes complicated. It should still break down at high curvature, but there may be further situations where you could see departures from general relativity.

Erik Verlinde, for example, interprets dark matter and dark energy as relics of quantum gravity. If you believe this, we already have evidence for quantum gravity! Others have suggested that if space-time is made of microscopic constituents, then it may have bulk properties like viscosity, or give rise to effects normally associated with crystals, like birefringence or the dispersion of light.

In summary, the expectation that quantum effects of gravity should become relevant for strong space-time curvature is based on an uncontroversial extrapolation and pretty much everyone in the field agrees on it.* In certain approaches to quantum gravity, deviations from general relativity could also become relevant at long distances, low acceleration, or low energies. An often neglected possibility is to probe the effects of quantum gravity with quantum superpositions of heavy objects.

I hope to see experimental evidence for quantum gravity in my lifetime.

This question is really off-topic. Also, you can answer it by way of Wikipedia. The short story is that they were both developed independently, by what you could loosely call particle physicists, and it turned out later that string theory required supersymmetry for consistency.

What seems straightforward to a primitive like me - reconciling GR's observations in a simpler framework - is that the physical curvature of spacetime is problematic. Perhaps on a graph the magnitude of gravity's influence is correctly plotted as a curvature; however, physically I doubt that is exactly what's going on in nature.

You often talk about physics' need for strong foundational assumptions, with which many others agree. Yet, just as in Einstein's time with Newton, there is very little support for completely dismantling GR. Taking what we've learned and incorporating it into a more empirical foundational framework, I think, holds more promise than anything else in tying more pieces of the puzzle together.

I recently stumbled over Causal Set Theory and its "prediction" of fluctuations of lambda, whose stochastic evolution between positive and negative values gives rise to a small and positive value of the CC, as an interesting idea.

Is there any good reason to dismiss the possibility that QM is actually a classical theory in disguise, as 't Hooft is trying to prove? What if instead of quantizing gravity one should *classicalize* QM?

As we discussed before, I believe, the problem is not so much to convert QM into a classical theory, but to do it in such a way that the resulting theory is still predictive (hopefully, more predictive than QM).

Once you convert QM to a classical theory you may be able to merge it with GR more easily. The resulting theory may be more predictive than either QM or GR, but of course I do not know that. My question is whether you are aware of obvious problems with this path. Were there attempts that failed for some insurmountable reasons? Or is it just that nobody believes QM could be fundamentally classical, so this hypothesis is left unexplored?

I have one more question that has bothered me for some time. You say that superpositions of large objects should be studied. I am not sure that such superpositions are possible to prepare. My limited understanding is that the superpositions are limited by the uncertainty relations. You cannot be more uncertain than Heisenberg's principle requires (of course you can just refuse to look, or ignore the information that is available, but this uncertainty would not create a superposition). But, for example, the uncertainty in the positions of, say, 1 kg masses is so small under any imaginable experimental setup that I just don't see a practical way of extracting experimental results from such a setup. One way I hear you could get such superpositions is by using a Schrödinger cat experiment, but such an experiment seems conceptually wrong to me. The uncertainty about, say, the position of the cat does not increase by placing the cat into a box. It is still given by the uncertainty relations, so it will be practically null. The position of the cat is, in my opinion, always exactly known, regardless of the use of a box.

't Hooft has said that his theory implies that quantum computers cannot factor large numbers. So if quantum computers are actually shown to work, this will be a good reason to dismiss 't Hooft's theories, unless you can tweak them so they don't actually make this prediction.

You cannot be less uncertain than Heisenberg's uncertainty principle requires (you can use squeezed states to reduce uncertainty in one dimension while increasing it in another, but this doesn't reduce the total uncertainty). However, you can have superpositions over much larger distances than one would expect from Heisenberg's uncertainty principle. It's still extremely difficult to measure the gravity attraction of an object in superposition, but as far as I know, there is no theoretical obstacle to doing this.
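To put numbers on this point, here is a minimal estimate of the Heisenberg-limited position spread of a kilogram-scale harmonic oscillator. The mass and trap frequency are illustrative choices, not values from the discussion:

```python
# Ground-state (Heisenberg-limited) position uncertainty of a harmonic
# oscillator: dx = sqrt(hbar / (2*m*omega)). For macroscopic masses this
# is absurdly small, so any measurable superposition must be prepared
# over separations far larger than the intrinsic quantum spread.
import math

hbar = 1.054571817e-34  # J s
m = 1.0                 # kg (hypothetical test mass)
f = 1.0                 # Hz (hypothetical trap/pendulum frequency)
omega = 2 * math.pi * f

dx = math.sqrt(hbar / (2 * m * omega))
print(dx)   # ~3e-18 m: far below any direct positional resolution
```

This is why the experimentally interesting regime is deliberately prepared superpositions over large separations, not the intrinsic uncertainty of a heavy mass.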

"Yes, by making good use of quantum features, it will be possible in principle, to build a computer vastly superior to conventional computers, but no, these will not be able to function better than a classical computer would do, if its memory sites would be scaled down to one per Planckian volume element (or, in view of the holographic principle, one memory site per Planckian surface element), and if its processing speed would increase accordingly, typically one operation per Planckian time unit of 10^-43 seconds."

So, it is not enough to prove that quantum computers work (he accepts that), but also that they outperform a similarly sized Planckian classical computer.

Peter Shor: "you can have superpositions over much larger distances than one would expect from Heisenberg's uncertainty principle."

I do not see the distance as being a limitation imposed by the uncertainty relations. If an electron is prepared in a state close to a momentum eigenstate, it would be in a superposition of positions over a large distance. But as the mass of the object increases, the distance over which the superposition takes place decreases, so for a macroscopic object it becomes irrelevant.

Another problem as I see it is this: If you can measure the mass of an object in superposition it means you already know where it is (otherwise you could have the same effect from a larger, but more distant object), so it cannot actually be in superposition :).

" By creating quantum superpositions of heavy objects. This causes the approximation in which matter has quantum properties but gravity doesn’t (the “semi-classical limit”) to break down, revealing truly quantum effects of gravity. An often neglected possibility is to probe the effects of quantum gravity with quantum superpositions of heavy objects. "

What sort of predictions do the different QG and emergent-gravity approaches make, should these experiments reach the requisite sensitivity?

@ Neo and all, Roger Penrose in his Road to Reality discusses this. He does so within the context of his R-theory, but whether one considers that fundamental or as an effective theory or phenomenology, Penrose's discussion is interesting. He talks about the superposition of masses and its effect on the metric. He also proposes some experiments to examine this.

Penrose largely argues for his R-theory of gravitation induced wave collapse. This is in some ways similar to what Pullin has in the Montevideo interpretation. Penrose even proposes some possible experiments. You will have to read his book, which is really a nice overview of physics and math-physics. He has these little problems one can solve as well that are fun to work. Even the hard ones are really not that tough if you have a physics background.

It would be surprising if canonical quantum gravity for weak or linear fields were wrong. If we perform measurements of superposed positions of large mass or large S = Nħ, N >> 1, systems we will be testing in some ways this sort of weak quantum gravity. It will be complicated because of stress-energy source terms, but it is workable.

That is the standard way of thinking about it. The AdS/CFT correspondence is a similar equation:

nonlocal gravity in AdS bulk = local conformal QFT on the boundary.

This points to a duality between local and nonlocal physics. Since gravitation is such a weak field it is strong only in the UV limit, such as near a singularity, and we might write the Einstein field equation as

UV quantum gravitation = IR field theory.

This is an interesting duality with the UV and IR limits of physics in a duality. I am tinkering with something that implies another duality with quantum entanglement.

We might think of the extremization of a geodesic length as a statement that the quantum state of the particle in motion is in some maximal entanglement with spacetime. Gravitational waves are generated by stress-energy sources that would be so entangled, and might then tell us that spacetime itself exists in some maximal entanglement with itself. Spacetime is then constructed from entanglement.

Then for a massive quantized object, massive enough to exhibit measurable gravity, the superposition of this object's configuration is an entanglement with the gravitational field. Tests of quantum gravitation would then be some test of nonlocality of some aspect of gravitation. This might be some form of Cavendish experiment, or a test of whether this superposed gravity field will optically split a photon into two paths.

Hi Sabine... You state that physicists "have known since the 1930s that quantum gravity is necessary to bring order into the laws of nature"... but is this true? What if the mechanism of force generation is associated with how the associated energy field is created? Do physicists specifically know how any energy field is created? Perhaps that knowledge may provide some insight into why Gravity does not fall into the same "mode" as the quantum associated forces.

My humorous thought is that the prospect of human experimentalists performing quantum gravity experiments is keeping the team responsible for running the universe simulator awake at night. The architects are struggling with the behaviors, the developers are struggling with the implementation, the operations people can't figure out where to get the necessary compute cycles, and the HR department wants to shut the whole thing down with an ever-expanding AdS bubble. ;-)

While that is an option that cannot be excluded, I do not know of a way to make it mathematically consistent. Physicists try to develop a theory of quantum gravity because such a theory is necessary to solve certain problems in the existing theories. Maybe there are other ways to do it than quantizing gravity. But I don't know one.

Happily I see the topic "quantum gravity" is back on the blog. I share the hope to see a major step here during my time on earth. Personally it's the most exciting field in fundamental physics for me.

I graduated in physics at the end of the nineties but never worked in physics research afterwards. So as a first order approximation you can consider me as a layman.

Having said that, I'd like to ask a question: There are those conceptual ingredients and tools like string theory and the holographic principle. You also mention them with respect to Verlinde’s work. As far as I can perceive, much of that is based on AdS space. The universe we observe seems not to be compatible with an AdS space but is rather similar to a dS space.

So I wonder: Even if theoretical explorations in AdS space discover elements that seem somehow similar to things we observe - is it likely that this can finally lead into a viable model for our universe? Or are AdS space and dS space all too different?

I understood that many things are easier in AdS space, and hence I understand that it is only natural to have a tendency to explore theories in AdS space. But can you really get to something real there in the end?

Which of the properties of and "discoveries" in AdS space can - in some form - be transferred to the dS-space world?

This has been something I have pondered over a lot. It does appear that quantum gravity prefers the AdS spacetime with the negative curvature Λ < 0. This is certainly the case for AdS/CFT correspondence. Supersymmetry is broken with positive energy as well. The observable universe is de Sitter-like with Λ > 0 and putting string theory here and potentially much of quantum gravity is maybe pounding a square peg into a round hole.

I do, though, have this idea that maybe AdS_5 has holographic screens defined by 4-dimensional regions that bound causal wedges. One can think of the dS spacetime as a single hyperboloid sheet bounding a cone, while the AdS is one of a pair of hyperboloids within the top and bottom parts of the cone. These do meet at I^± (scri^±). The AdS_4 and dS_4 might then exist on these holographic screens with junction conditions that are positive or negative. From a quantum perspective these are two states determined in a way similar to the Haldane chain for quantum critical points.

I think this would force quantum gravity on a dS_4 to be a form of asymptotically safe gravity. The stringy aspect of the AdS_5 might then be mapped into a different form of gravitation from the AdS_5 bulk. The duality between AdS_5 and the CFT_4 on the boundary would then mean the physics on these junction regions might carry CFT_4 physics. Quantum gravitation would then be gauge-like or asymptotic.

Thanks for your elaboration on this. So that would mean that it’s maybe not a question of “dS or AdS” but that the two could rather be “tied together” in a very unique way. I wonder what testable predictions would be possible to check for that.

For some reason I was looking here and found these old replies. I could go into some depth on how dS and AdS are related to each other. I think it is similar to the momentum cones in the Haldane chain and symmetry protected topological states. Of course I would be theory mongering if I went into great depth. This is a part of my solid state physics view of some things.

I fear situations where theory races far ahead of experimental evidence. If it comes up with a mathematically coherent theory, that may become completely institutionalised before any experimental evidence is available to refute it.

Imagine if Newton (or even Einstein) had known about the large deviation from the inverse square law, generally referred to as 'dark matter'.

By way of contrast, think of quantum theory. This was developed in an environment where a great deal of relevant experimental data - such as the energy levels of atoms, the shapes of molecules, etc. - was becoming available.

>The major obstacle on the way to progress is the lack of experimental guidance.

Let’s assume (just for practical reasons – as the Milky Way Galactic Collider did not confirm QG and there are some unclear budget and technology issues with the Local Group of Galaxies Collider) that there is nothing experimentally interesting at the Planck scale – in other words, that behind the Big Desert is an Even Bigger Desert. (Here I’m aware of possible oversimplification – maybe there are hidden answers to all the Open Questions of Cosmology there.) Is this the end of HEP? Or shall we reduce/stop BSM fantasies and look again, and much more carefully, at the SM (its inconsistencies and unanswered questions) and try to find gravity within it? I mean it as a task for theorists, to avoid misunderstandings.

>Others have suggested that if space-time is made of microscopic constituents, then it may have bulk-properties like viscosity, …

"another case where quantum effects of gravity should become measurable that is often neglected: By creating quantum superpositions of heavy objects"

Doesn’t nature already do this (herself)? I’m thinking about solar flares – a natural experiment. Whenever they happen, many particles are ejected from the otherwise rather ‘steady’ sun very suddenly. The chemical composition of the sun is very well known, and in any case it can be measured what stuff (particle type) this was, how much, and how energetic, because the earth is bombarded with it regularly, whenever the flare is aligned rightly – or maybe I should say wrongly, because of the hazards of solar flares. But in any case, whenever a solar flare happens the EM spectrum also changes, and this is already being measured with great precision. That means more photons flying around at different energies, and if I understand QM correctly, whenever people speak of a thing being observed and its wavefunction changing (or collapsing), that actually happens because it interacts (for instance with a photon) and not because a mouse or a man was looking at it or thinking about it. To wrap up, LIGO would also have to be pointed at the flare, to figure out what happened there gravitationally. So basically, you would be studying a (large, heavy) beam of particles of which every one is hit (on average) by more photons, changing the wavefunction (on average), and you would be hoping for noticeable gravitational effects. Proposal for an experiment!

So while black holes and Milky Way size particle colliders are out of reach, quantum superposition of heavy objects may be a way to test quantum gravity.

Wouldn't these experiments also test the Penrose interpretation, that the cause of "wave function collapse" is sufficiently high mass? It looks like he guesses that one Planck mass is enough to stop superposition, a mass of about 20 micrograms. Sounds doable with some effort. Though I guess these small experiments may not find anything surprising and we will have to wait till we somehow test the high-energy regime. But it sounds like we should at least try.
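For reference, the quoted figure is easy to check numerically; the Planck mass sqrt(hbar*c/G) works out to roughly 22 micrograms. This is just arithmetic with CODATA constants, not a statement from the comment:

```python
# Quick check of the "about 20 micrograms" figure: the Planck mass.
hbar = 1.054571817e-34  # J s
c    = 2.99792458e8     # m/s
G    = 6.67430e-11      # m^3 kg^-1 s^-2

m_planck_kg = (hbar * c / G) ** 0.5
m_planck_ug = m_planck_kg * 1e9   # kg -> micrograms
print(m_planck_ug)   # ~21.8 micrograms, consistent with "about 20"
```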

Animated discussions on FCC aside... let me ask you a question, slightly off-topic, but doesn't need a long answer:

Talking of gravity, what is your take on the measurements with anti-matter trying to determine whether anti-matter "goes up" in the gravitational field of the earth (e.g. Gbar and AEGIS at CERN), i.e. the possibility that ours is a Dirac-Milne universe? The little I've read on this, it would explain many things (not necessarily related to quantum gravity, of course).

Please avoid off-topic discussions. My take on those measurements is the same as that of pretty much every theorist: Not well-motivated theoretically. Basically there's no consistent theory to make it work. But since it's not a hugely expensive experiment, by all means, let those experimentalists have some fun.

The Milky Way sized particle collider estimation is for linear collider technology, right? If there was a way to add center-of-mass energy per unit distance much much faster, perhaps we will someday find a way to probe these energies? Not today or tomorrow, of course.

The Galactic Council of CERN thinks the Outer Rim Collider (ORC) is justified!

I admire your writing style; I hope to be able to write like you one day. Perhaps we should accept the idea that a duality of points of view is something fundamental. Perhaps a particle has to follow all the paths that (summed or integrated) allow it to preserve its quadrupole moment, insofar as it has no mass/energy to yield to space. Or an electron has to be scattered over its orbital, because it has no energy subunits to be yielded into gravitational waves (as the moon does, for example). Maybe this is the boundary between the two phenomena: gravity would be a differentiable form of interchange with the fabric of space, but whenever that is not possible, or only partially possible, behaviors become non-differentiable, fractal, probabilistic, in order to circumvent that fabric. If so, maybe nobody will ever measure a quantum gravity effect.

Your comments about doing experiments with quantum superpositions of heavy objects -- for which most physicists think they already know what we'll find -- bring to mind an analogy with debugging a program. Sometimes you reach a point where you've checked all the obvious failure points, found nothing, and the buggy results seem inexplicable. At this point you just have to systematically test all your assumptions about the program, even the ones you are certain are true, because SOMETHING you're assuming must be false. Something similar may be necessary to reconcile QM and GR.

The LIGO test masses are each 40 kg, and the quantum oscillation of these masses forms part of the limiting sensitivity of the detectors. Is there some experiment that can make use of the tremendous sensitivity of these detectors that will be affected by quantum gravity?

Well, strictly speaking all gravity is quantum gravity. The question is can you use the measurements to demonstrate that gravity is quantized. For the LIGO test masses the answer is no. That is because if you want to show that gravity is quantized you need to measure the gravitational field. LIGO does not measure the gravitational field of the test masses.

Yes, that's right, it's a conjecture. Alas, I was replying to a question that asked how to test quantum gravity, hence presupposing it does exist already. My point is simply that if quantum gravity exists then, strictly speaking, all gravity is quantum gravity; it's just that sometimes the effects are so weak you can't tell. It's like how, strictly speaking, all electrodynamics is quantum electrodynamics, just that in many cases you'll not notice the "quantum" aspect of it.

"New Map of Dark Matter Spanning 10 Million Galaxies Hints at a Flaw in Our Physics" by Michelle Starr on 14 FEB 2019
https://www.sciencealert.com/new-map-of-dark-matter-shows-something-could-be-wrong-with-the-standard-model

"an international team of astronomers has used one of the world's most powerful telescopes to analyse that effect across 10 million galaxies in the context of Einstein's general relativity. ... The most comprehensive map of dark matter across the history of the Universe to date.

"It has yet to complete peer-review, but the map has suggested something unexpected - that dark matter structures might be evolving more slowly than previously predicted.

"If further data shows we're definitely right, then it suggests something is missing from our current understanding of the Standard Model and the general theory of relativity," said physicist Chiaki Hikage of the Kavli Institute for the Physics and Mathematics of the Universe."

If 't Hooft’s recent arguments for the existence of a mathematically self-consistent momentum wave interpretation of all particles entering/exiting an event horizon are valid, wouldn’t this suggest that the end state of gravity is not a singularity, but something more akin to a spherically symmetric Fermi space in which all particles, not just electrons, form spatially distributed bands? This is relevant to the question of quantum gravity because it suggests that GR needs to be transplanted into a more symmetric, Wigner-space like framework. Open flat space (space-time dominated) and isolated black holes (momentum-energy dominated) would then emerge as two extrema of this gentler but more complete GR model. A Fermi sea variant of momentum space, located just above the absolute event horizon, would act as a sort of cosmic safety net that makes Kruskal-Szekeres coordinates and quantum gravity irrelevant to the day-to-day physics of even the most extreme regions of the universe. This GR-in-Wigner concept would be an emergent gravity model, since ordinary and momentum space would both become extrema in relationships between large ensembles of fermions and bosons.

I hope that this recent article by Christian Wuthrich (about routes to QG that originate from GR, addressed mainly to those interested in philosophy of physics) can be considered relevant and interesting:
https://arxiv.org/abs/1902.02099

In his earlier article about BH thermodynamics and information-theoretic approaches towards BH physics he writes that:

“the Bekenstein-Hawking formula for black hole entropy is widely accepted as 'empirical data' in notoriously empirically deprived quantum gravity” and “regardless of which programme in quantum gravity they adhere to, physicists of all stripes take it as established that black holes are thermodynamic objects with temperature and entropy.”
https://arxiv.org/abs/1708.05631

Is it really considered a "litmus test for quantum gravity"?

(I know it is on the verge of being a bit off-topic, but I wonder what you think about such ideas: https://doi.org/10.1088/1742-6596/880/1/012014)

Mrs Hossenfelder never considers well-founded theories that start from an axiomatic foundation. If deduction to more complicated levels of structure and behavior is performed with proper mathematical methods, the headaches are largely prevented. However, this requires sufficient mathematical skill, which is where most physicists fail. Mathematics tends to guide and restrict the extension of well-selected foundations.

This is probably a tangent, but: I'm curious why you put it that way. Is it because you really think that is precisely true, or is it because you were writing an article for the general public and so are phrasing it in terms they can easily relate to?

As a thought experiment, consider two systems of particles. They are identical except in one respect: in system A the particles are all entangled with one another, and in system B none of them are. Do A and B have different weights?

Andrew, picture the wave functions for the two cases. Entangled properties will have large, highly delocalized wave functions, while non-entangled (observed) properties will have more localized and thus "sharper" wave functions. Since sharper means more high-momentum components, the more localized wave functions will have a bit more energy. Thus, if all other parameters are the same, the more observed, less entangled system will have a tiny bit more gravitational pull.
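A minimal numerical sketch of that localization-energy tradeoff, using a Gaussian wavepacket as a stand-in for the "sharper" wave function (natural units, hbar = m = 1; the function name and grid parameters are illustrative, not from any of the works discussed):

```python
import numpy as np

def kinetic_energy(sigma, n=4096, box=40.0):
    """Mean kinetic energy <p^2>/2m (hbar = m = 1) of a Gaussian
    wavepacket of position-space width sigma, via a numerical
    Fourier transform to the momentum representation."""
    x = np.linspace(-box / 2, box / 2, n, endpoint=False)
    psi = np.exp(-x**2 / (4 * sigma**2))               # Gaussian wavepacket
    k = 2 * np.pi * np.fft.fftfreq(n, d=x[1] - x[0])   # momentum grid
    prob_k = np.abs(np.fft.fft(psi))**2
    prob_k /= prob_k.sum()                             # momentum probabilities
    return 0.5 * np.sum(k**2 * prob_k)                 # <p^2> / 2m

# Analytically <p^2>/2m = 1/(8 sigma^2), so halving the width
# quadruples the kinetic energy: sharper really does mean more energetic.
```

Squeezing the packet in position space spreads it in momentum space, which is the uncertainty-principle mechanism behind the extra energy of "more observed" states.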

Another way of saying this is to use Richard Feynman’s observation that a property remains quantum only if there exists no observation of it anywhere in the universe. Since observation creates information, which is a form of heat and energy, the more observed and less entangled system will always contain a bit more heat than its less observed, more entangled equivalent.

It's a little strange that E. Verlinde proposes entanglement as a foundation for emergent gravity. We can see that interactions (observations) are momentum and energy in matter structures, and hence a foundation for the gravitational curvature of spacetime. Maybe entanglement correlates with gravitational changes and decoherence, maybe with the Higgs mechanism too...

Eusa, imagine an electronic fund of one million Euros that can be dynamically distributed and redistributed among millions of people, with one restriction: No matter how complex and rapid those dynamic redistributions of the money become, the total sum must always remain precisely one million Euros. To do this, you will need some non-trivial universal rules and infrastructure, since otherwise you could get into situations where dynamic movements of funds lead to transitory states in which the sum rises above or falls below one million. These rules may get particularly odd and draconian for very small transfers, since more of them could occur at roughly the same time.

Matter and energy have this same problem with regard to absolute conservation of quantities such as mass-energy, charge, and angular momentum. Matter and energy must also have non-trivial universal rules and infrastructure to ensure multiple forms of absolute conservation. In emergent space theories, these rules lead to behaviors that at classical scales look a lot like what we call space. At the smaller, micro-exchange level these absolute conservation rules can become a lot more squirrelly (technical term), and we call this much less intuitive set of behaviors quantum entanglement.

Verlinde added to this emergent space framework the idea that gravity itself is part and parcel of this scheme, an emergent force with no simple equivalence to any of the charge-mediated forces of quantum mechanics.

The actual derivations of both emergent ("holographic") space and emergent gravity were intensely mathematical and very abstract in comparison to the analogy I just gave, and bring in issues such as the overall configuration of space and time. But that does not change the incentives that lurk underneath the mathematics, in particular the profound focus that the universe seems to have on maintaining its absolute conservation laws.

No problem with Verlinde's math - I have the same, but in my interpretation the holographic principle emerges from the center/cause of an elementary particle field, not out of entanglement correlations. I see information theory connecting discrete inertia with continuous gravity via just that math. This is very compatible with the energy tensor and the principles of general relativity.

Entanglement could be a parity of interaction distances conserved as a property of spacetime. And if a little speculation is permissible, it may be possible that "spacetime automata" are created and annihilated by field-change perturbations, manifesting as dark energy and dark matter...

Your points are nicely stated and interesting to contemplate. Robert Spekkens at PI has written persuasively about how quantum mechanics in particular enables a wide range of trade-offs between abstractions of state (kinematics) and evolution of states (dynamics), leading to seemingly very different theories that nonetheless end up making the same experimental predictions. What we personally choose to put at the top of our principles-of-greatest-importance list guides our analytical approaches in powerful ways, such as Einstein's focus on the equivalence principle, or (less known) John Bell's defiant use of the often-maligned pilot wave model to sharpen his mind in just the right way to derive his famous inequality. I freely admit to finding the principle of absolute conservation to be a Muse through which I like to peer, one that I find personally enlightening in often unexpected ways. I am a-mused by it, to use the literal derivation of that word, but only as one of many possible intriguing priorities for deriving and organizing analytical approaches.

Where what is calculated? Also, it would help (a) if you would address the person you are trying to communicate with and (b) if you could please choose a pseudonym other than "Unknown", because there are dozens of Unknowns in this comment section. Even a number would help.

Regarding: “We might think of the extremization of a geodesic length as a statement that the quantum state of the particle in motion is in some maximal entanglement with spacetime.”

I get the impression that particle physics, and the people who practice it for a living, discount entanglement as a major part of what a particle is and can become. For example, revolutionary ideas that emerged from the study of superconductivity, such as the generation of a photon mass in a superconductor, were later extended from condensed matter to other fields of physics, famously serving as paradigms to explain the generation of the Higgs mass of the electroweak W and Z gauge bosons in particle physics. Maybe the reductionist methods of particle physics are not the proper tool to answer the questions that arise when fundamental particles are exposed and connected to other levels of nested complex interactions with other physical processes.

System A will have all the particles in its ensemble stabilized at their lowest energy state, or ground state, to meet the entanglement requirement, whereas System B will have all particles at energies above the ground state, with each particle's energy state different from every other's. This is assumed to ensure that no particles in System B are entangled.

This means System B will have markedly more energy dispersed throughout its ensemble of particles than will System A with an equal particle count.

System B is therefore more energetic and consequently produces more gravitational attraction, all else being equal.

I just read that a chemist has come up with a way to produce tri-hydrogen H3+ in large amounts. Why can't the LHC use H3+ as feedstock instead of protons? Using tri-hydrogen H3+ would increase the energy available at collision at least six times over protons.

Xyz space has of course a curious conjugate called momentum space, call it abc, that is accessible via Fourier transform. Quantum wave functions can be expressed as easily in abc as in xyz. Abc is also the basis for electron bands, making it a foundation of solid-state physics.
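A small sketch of that interchangeability, using a discrete unitary Fourier transform as a stand-in for the xyz-to-abc map (the grid size and random state are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# A generic 1-D "wavefunction" on an xyz-space grid (complex amplitudes).
psi_x = rng.normal(size=64) + 1j * rng.normal(size=64)
psi_x /= np.linalg.norm(psi_x)                 # normalize in position space

# Unitary Fourier transform to the conjugate (momentum, "abc") basis.
psi_k = np.fft.fft(psi_x, norm="ortho")

# The same state, equally well described in either basis:
assert np.isclose(np.linalg.norm(psi_k), 1.0)            # Parseval: norm kept
assert np.allclose(np.fft.ifft(psi_k, norm="ortho"), psi_x)  # fully invertible
```

Because the transform is unitary, no information is gained or lost going between the two descriptions; they are two coordinate systems for one state.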

However, physicists do not usually think of abc as fundamental in the same sense as xyz. Unlike universal xyz, instances of abc are limited in size and occur only in islands of condensed matter, where their properties depend on the nature, number, and configuration of electrons and atoms. Abc can be interpreted simply as Pauli exclusion raising delocalized electrons to more energetic momentum states, with those states then represented mathematically as distances in abc space. This energy cost also keeps abc islands small and densely occupied.

Gentle Trio provides an exceptionally simple explanation for why condensed matter sometimes provides high-energy insights: Like abc space, xyz space is nothing more than the result of Pauli exclusion across a large ensemble of fermions. That is, in Gentle Trio fermions are more fundamental than space, and can organize under Pauli exclusion in two distinct ways: xyz, which gives universe-spanning flat space, and abc, which gives both solid-state momentum spaces and black hole event horizons ("holographic" Fourier projections of matter). To borrow a software analogy, xyz is an "open source" space where adding distance is free (no energy cost), while abc is a "proprietary" space where adding distance comes with a high energy cost. These energy economics make xyz the universal "Internet" of particle interactions, while abc ends up localized and broken into innumerable "private apps".

Notably, this would make momentum space an underutilized platform for deep research into the nature of spacetime.

As I noted in the post you responded to, there is a loophole in the prescription for abc space that you describe, and that loophole is entanglement. Entanglement with all sorts of bosons enables large ensembles of fermions to become free of Pauli exclusion and behave just like xyz, an "open source" space where adding distance is free (no energy cost). With boson entanglement, abc is just as capable as xyz of maintaining existence in universe-spanning flat space, which gives both solid-state momentum spaces and black hole event horizons ("holographic" Fourier projections of matter). These quasiparticles become free and independent of the processes that created them, transforming their energy economics to make abc the equal of xyz in the universal "Internet" of particle interactions. Yes, I have seen evidence that such constructs can become as independent, self-sustaining, and long-lived as any particle created at the LHC.

Your comments are interesting but difficult to assess without more details on the nature of your quasiparticles. Are you proposing some form of dark matter?

I certainly agree that multi-particle entanglement, particularly between fermions and bosons, is a critical component of taking the big leap from simple Pauli exclusion to emergence of the full properties of both xyz and abc space. Pauli exclusion is just where the process begins, not where it ends. I intentionally made no attempt to address that complex issue in my already over-packed comment.

I'm not at all averse to the idea that abc could be just as universal as xyz, but obviously (as you seem to be proposing) you would need some additional mechanism to link the isolated abc islands. I've wondered more than once whether some currently unknown aspect of entanglement could explain the odd correlations between large black holes and the structure of galaxies, which are inexplicable by simple gravitational attraction. Perhaps a universal abc could help by enabling deeper non-gravity connections between black hole event horizons and condensed matter thousands of light-years away.

Incidentally, discarding the concept of any kind of black hole interior (one of the three principles of Gentle Trio Unification, GTU) enables new theoretical approaches to questions such as how polar jets work, or why quasars seem to have existed only in the early universe. The firewall _becomes_ the black hole, with no loss of information. Infalling matter would likely break down into band-like fermion states. These would be well-hidden but potentially enduring and complex, and notably they might support magnetic fields (!). Would protons and neutrons remain stable in these delocalized bands, or would they quickly be crushed down into quark soup, and eventually (with some conservation violations) into photons? Such questions could lead to diverse black hole types, in sharp contrast to the traditional minimal-state black hole model.

Perhaps this is too simplistic, but if all current descriptions of spacetime (such as GR) boil down to equations in xyzt, and xyzt are presumed to be real numbers, then we are automatically assuming that spacetime is infinitely divisible. Which cannot be true. So any "differential equation" approach to spacetime is fundamentally wrong at some level.

The Mandelbrot set is famous for its limitless fractal beauty: no matter how detailed your last iteration of it was, you can always reuse it as the seed for a new one to reveal still more complexity. In contrast to the limitless details of the set, its defining equation is very compact: f_c(z) = z^2 + c. You iterate this equation from z = 0, and if the resulting values do not diverge to infinity, then that complex number c is part of the Mandelbrot set.
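That membership test is simple enough to write down directly (the iteration cap of 100 is an illustrative choice; |z| > 2 is the standard escape criterion):

```python
def in_mandelbrot(c, max_iter=100):
    """Iterate z -> z**2 + c from z = 0; c belongs to the Mandelbrot set
    if the orbit stays bounded (escape is certain once |z| > 2)."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

# c = 0 and c = -1 stay bounded forever; c = 1 escapes after three steps.
```

All of the set's famous complexity comes out of these few lines, which is exactly the point about compact rules versus limitless detail.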

Imagine that deep in the Mandelbrot set there exist sentient mathematical entities that, like us, are busy pondering the nature of their universe. One of their great mathematical philosophers, Platobrot, long ago postulated that reality is composed from some set of infinitely tiny but perfect objects. Mandelbrotians thus define their real mathematics in terms of infinite sums of these infinitesimal perfect entities. Since the infinities cancel this works after a fashion, but it is computationally costly and leads to all sorts of singularities and paradoxes.

Like that of the Mandelbrotians, much of our math and physics is written upside down. In both math and physics it is the compact rules and processes for creating details that are the most fundamental, not the unobtainable limits (e.g. "point particles") of those rules ("PAVIS", in another draft). Thus if the quantum mechanics of spacetime comes into existence only when the rules of physics are "funded" by the presence of mass-energy (GTU pillar 3), then our formalisms need to include this simplifying constraint explicitly. It means that the flat-space gulfs between galaxy superclusters are full not of seething virtual particle pairs at the Planck foam level, but of something much simpler: nothing at all.

Examples of the jaw-dropping increases in computation efficiency possible when more complete (Wigner space) quantum models are calculated top-down can be found in a series of papers by Jean Michel Sellier, now at MILA in Montreal doing AI.
