Can Reuter explain MAGIC?

Reuter makes quantum gravity renormalizable by letting the action run to a fixed point as the momentum scale k --> infty.

so he gets a whole series of metrics g_k

these metrics probe spacetime at a finer and finer scale. k is like the reciprocal of the characteristic length of an interaction---a kind of 1/r where r is an interaction distance or wavelength typical of the process studied.

or if that is too simple, k is monotonically related to such a characteristic 1/r

The reason Reuter gets classical relativity out of this is that he has a trajectory where, over a long, long range of k, the action stays the classical action and the metric g_k stays the same. Only when k gets very large does g_k stop being classical and start heading for the fixed point.

So suppose you have two galaxies A and B and one is sending photons to the other.
THE DISTANCE between A and B depends on which metric g_k you use to measure it. The metrics are approximately classical but slightly different. Which metric is appropriate depends on the WAVELENGTH of the photon that is doing the traveling.

this is the scale at which it interacts with the (possibly quantum-bumpy) geometry of the space between the galaxies.

It is possible that in Reuter's picture the more energetic photon sees a bumpier road from A to B and actually a LONGER DISTANCE that it must travel.

So even if they both nominally travel at speed c, the more energetic one might get there a little bit later.

Any thoughts? Would anyone like some links to Martin Reuter papers?
I am following up on the MAGIC report of a 4-minute delay of some TeV photons after a half-billion-year trip.
We can dismiss the MAGIC report as indicating nothing more than some unknown effect at the source, and perhaps observational error, or we can consider possible explanations assuming that their finding might turn out to be significant.
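For orientation, a quick back-of-envelope check (my own round numbers, not the paper's fit): if a first-order effect gives a delay dt ≈ (dE / M_QG) · T, the reported 4 minutes over a half-billion-year trip implies a QG scale well below the reduced Planck mass.

```python
# Back-of-envelope: what QG scale does a ~4-minute delay imply, assuming
# a first-order suppression dt ≈ (dE / M_QG) * T? Round numbers, not a fit.
SEC_PER_YEAR = 3.156e7
T = 0.5e9 * SEC_PER_YEAR     # half-billion-year trip, in seconds
dE = 1.0e3                   # ~TeV energy difference, in GeV
dt = 4.0 * 60.0              # reported delay, in seconds

M_QG = dE * T / dt           # implied scale, in GeV
M_P_REDUCED = 2.4e18         # reduced Planck mass, GeV
print(M_QG, M_QG / M_P_REDUCED)
```

On these assumed numbers the implied M_QG comes out a couple of percent of the reduced Planck mass, which is why the quote below entertains scales smaller than M̂_P.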

p. 2: "One might guess that the QG scales M_QG1, M_QG2 ≈ M̂_P, where M̂_P = 2.4 × 10^18 GeV is the reduced Planck mass, but smaller values might be possible in some string theories [3], or models with large extra dimensions [5]."

I already started a thread .... nobody wants to comment
(Can spinfoam/LQG use a G at particle size?)

I don't see how that would make sense. Lorentz invariance is an exact local symmetry in both classical and quantum field theory of gravity; term by term you will always see it. The RG flow does nothing to change this; in his scheme it just plays with the running couplings, like, say, Newton's constant.

To get a modified dispersion relation consistent with this experiment, that local symmetry has to be violated explicitly (exceedingly weakly) somewhere around the Planck scale. Worse, running things the opposite way, with matter interactions: unless a new symmetry protects you, under the renormalization group you will generate new marginal and irrelevant interaction operators that have no natural reason to be small at weak coupling (unless you fine-tune through many orders of magnitude, or find a dynamical mechanism to suppress them). This is why we already have extremely powerful bounds on LI breaking just from regular everyday particle accelerators, and why this whole business of looking for cosmological LI breaking is highly speculative and fraught with theoretical danger.

So whatever nonperturbative effects the RG may or may not lead to, it would have to output explicit LI-breaking terms as well as create a new, larger symmetry of nature to keep the LI violation from being so harsh that it is ruled out instantly. A tightrope that is probably inconsistent from the get-go; no known field theory has this sort of nonperturbative miracle.

So I say, with some degree of confidence, that nonperturbative QG (with the EH action), if it exists (as Reuter claims), is exactly Lorentz invariant, and I see this as a positive for the theory, not a negative.

edit: Actually, now that I think of it, it's far, far worse than that. It would lead to a gauge anomaly: you would have graviton states with timelike polarizations in the effective action around the Planck scale, which is ridiculous.


Now gravitons? If the Higgs boson conveys mass to particles and gravitons mediate the gravitational interactions between them, how do you explain the fact that matter in the high-redshift universe appears to act gravitationally just like in the low-redshift universe? Do you have a model that explains how the Higgs bosons and gravitons and their respective fields have remained so remarkably congruent over >13 Gyr? I'd love to see that.

Einstein said that gravitation, inertial effects, and EM propagation through space could be explained by modeling space as a dynamic medium, the properties of which could be moderated by the presence of embedded matter. So far, there's not a better game in town, despite the fact that he laid this out in 1920.

Each of Reuter's metrics g_k enjoys full local Lorentz invariance. As usual. That takes care of the needs of Quantum Field Theory etc. QFT is Reuter's specialty. Look up his papers.

Global solutions to GR do not have Poincare symmetry, as a rule, and Lorentz invariance does not apply over long distances.

So everything is fine and there is no contradiction. If you are going to talk about it, then you need to read the essential papers by Reuter on QEG. Otherwise it will seem that you do not know what you are talking about.
=====================

Hi Turbo, I think you may be making the same point I am, but from a different direction----the need to distinguish between local symmetry that field theories depend on---or at least approximate local symmetry---and the global spacetime manifold.

Marcus, I do not understand the content of your objection, it makes zero sense.

You say, "Each of Reuter's metrics g_k enjoys full local Lorentz invariance."

Which is *exactly* what I just said.

OTOH, if MAGIC is right and there is a nontrivial QG effect, this *cannot* be the case. Modified dispersion relations imply *local* Poincare invariance breaking (e.g. schemes like a time-varying speed of light, DSR, and so forth). There is no such thing as global Poincare invariance (it is no longer the isometry group of spacetime, *boggle*), and no one is talking about that.

Now gravitons? If the Higgs boson conveys mass to particles and gravitons mediate the gravitational interactions between them, how do you explain the fact that matter in the high-redshift universe appears to act gravitationally just like in the low-redshift universe? Do you have a model that explains how the Higgs bosons and gravitons and their respective fields have remained so remarkably congruent over >13 Gyr? I'd love to see that.

Turbo, there is no contradiction, nor do the two really have anything to do with one another. Gravity is associated with interactions that become strong around the Planck scale; the Higgs boson is an excitation of the Higgs field left over from electroweak symmetry breaking. These are completely different physical regimes, separated by roughly *16* orders of magnitude.

Now, on the scales we are familiar with (planetary, galactic, and cosmological), gravity completely swamps all other physical forces. It couldn't care less about the details of the electroweak force or of the strong force (which accounts for ~90% of the mass of ordinary matter, i.e. the mass stored in the interactions between quarks and gluons).

So yes, it's absolutely predicted and expected that the high-z and low-z universe will still be completely governed by general relativity, which, as I fully agree, is the correct macroscopic theory we should be using, at least until the quantum regime of the early universe (~10^-20 seconds).

No, you aren't getting it.
For each photon, pick a k.
That photon sees a smooth manifold with metric g_k.
There is no local invariance breaking in the manifold with that metric.
So what you say does not follow.

I just got interested now because I'm curious to see how he gets rid of the pathologies this obviously will, or could, create. The only known examples of theories that are dimensionally reduced like that are holographic in nature.

Maybe I'll write a post detailing a subtle caveat I omitted from my first post, on exactly how LI can be broken by a quantum field theory of gravity (though it is unlikely), even though its effective action and perturbation series respect the symmetry fully.

Marcus, I have this feeling we are not arguing about the same thing at all.

I keep insisting that I am specifically *NOT* talking about GR, or about quantum gravity as a field theory like what Reuter does. These are (minus the possibility I may or may not mention) fully and totally Lorentz invariant. Good!

When I talk about modified dispersion relations, I am talking about theories that were discussed and linked by the MAGIC paper, like time-varying-speed-of-light theories. These *explicitly* break Lorentz invariance by design, and they are what the authors of that paper claim they have some evidence for.

So yes, I interpreted your original post as wondering whether a quantum field theory of gravity (like what Reuter uses) can be made to break Poincare invariance in such a way as to fit the MAGIC observation. I claim it cannot without violating known constraints.

I think a lot of people don't "get" Reuter's idea of spacetime.
It is not a smooth manifold that most people are trained to deal with.
Reuter's continuum does not have a well-defined dimensionality---it is fractal-like, with a dimension around 2 at very small scales and 4 at macroscopic scales.

He says this explicitly in various papers---I can't take time now to fetch links.
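One toy way to picture a scale-dependent dimension (an invented interpolating formula for illustration, not Reuter's actual result) is a function that flows smoothly from 4 in the infrared to 2 deep in the ultraviolet:

```python
# Invented interpolating formula for an effective dimension d(k) that runs
# from 4 at large distances (k -> 0) to 2 deep in the UV (k -> infinity).
def effective_dim(k, k0=1.0):
    # k0 is a hypothetical crossover scale (say, near the Planck scale)
    return 2.0 + 2.0 / (1.0 + (k / k0) ** 2)

print(effective_dim(1e-3))   # infrared: close to 4
print(effective_dim(1e3))    # ultraviolet: close to 2
```

The particular formula is made up; what it illustrates is only the qualitative statement above: one continuum, whose effective dimensionality depends on the scale k at which you probe it.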

the point is that there is an infinite series of metrics g_k and for each k there is a smooth 4D manifold. The distance between galaxy A and galaxy B can be different for different k.

All the particle physics and QFT that you expect works fine because it works for each metric g_k on each manifold. There is no local break in Lorentz invariance. Most of the g_k look classical and pretty much the same anyway.

However two photons with different energies "see" slightly different 4D manifolds---because different k is appropriate for different photon energy.

It is almost impossible to detect such differences because the g_k are so close. Look at the Renormalization Group flow which Reuter and Saueressig plotted. And look at the trajectory in that flow which fits measured G and Lambda. You will see.
Only for very very long trips, or for very high energy will you be able to tell the difference.

When I say "you" I mean any interested reader. Some will not be interested in understanding or will not be able to get their heads around it.

the Reuter spacetime continuum, which has variable dimensionality at different scales and a microscopic 2D fractal structure, is the LIMIT as k --> infty of this sequence of smooth 4D manifolds. It is a case where you have a non-smooth limit of smooth objects.

you may think it is strange to have a spacetime that is not a smooth 4D manifold, and it is very strange. However it leads to easy calculation and makes a number of remarkable PREDICTIONS (like 60 e-folds of inflation without exotic matter, like the entropy of the CMB).

It also gets good words from people like Jan Ambjorn of simplicial QG and Alain Connes of noncommutative geometry. And it is attracting young researchers. So it has a lot going for it. That makes it worth overlooking the fact that it is a totally new mathematical object, different from the differentiable manifolds we are used to.

whatever you expect to do you can still do on each g_k, and then go to the limit

-Ok- I see what you are getting at; we are using language in different contexts.

Fix a k, then look at k + epsilon (or k + 1, whatever the convention). Classically (hbar --> 0) there must be a smooth, differentiable map and a privileged connection linking the two manifolds, or else you don't have GR. Good.

I suppose the claim is that in the quantum case, the map is no longer smooth and differentiable. But then look at the composition of the two *nearby* manifolds, taken as a new single entity; I claim that topologically it *must* possess a new local structure of some sort, and that this new structure no longer respects Poincare invariance exactly. An effective phenomenological Poincare breaking, if you will.

The leap of faith that's required, though, is why that new single entity no longer respects the usual local laws of physics. E.g. why can't we just use that to do quantum field theory with? Who cares about the global behaviour; it could already be a complicated mess classically.

I don't even want to think of the continuum limit, so ill-defined is taking the union of topologically different entities ;/

"We show that the running of gravitational couplings, together with a suitable identification of the renormalization group scale can give rise to modified dispersion relations for massive particles. This result seems to be compatible with both the frameworks of effective field theory with Lorentz invariance violation and deformed special relativity. The phenomenological consequences depend on which of the frameworks is assumed. We discuss the nature and strength of the available constraints for both cases and show that in the case of Lorentz invariance violation, the theory would be strongly constrained."

They only study certain cases (massive particles), but the central idea is the same as the one in this thread---running couplings can lead to a slight delay in arrival time.

They point out that in terms of phenomenology---WHAT ONE SEES---the effect of running coupling could be compatible both with DEFORMED BUT NOT BROKEN Lorentz, and with outright violated Lorentz. So they consider two subcases and they find that the outright broken instance would be "strongly constrained". Maybe they think the deformed-but-not-broken version is a bit more likely to be workable.
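To make the outright-violation case concrete, here is a generic first-order sketch (a standard LIV-type ansatz, not the specific model of that paper): a Planck-suppressed term in the dispersion relation gives an energy-dependent group velocity and hence an arrival delay that grows linearly with photon energy.

```python
# Generic LIV-type ansatz in c = 1 units: E^2 ≈ p^2 * (1 - xi * E / M_QG).
# To first order in E/M_QG this gives a group velocity
#     v_g ≈ 1 - xi * E / M_QG,
# so over a travel time T the delay relative to a soft photon is
#     dt ≈ xi * (E / M_QG) * T.
def arrival_delay(E_gev, T_sec, M_qg_gev=2.4e18, xi=1.0):
    """Delay (seconds) of a photon of energy E_gev over travel time T_sec."""
    return xi * (E_gev / M_qg_gev) * T_sec

# a ~1 TeV photon over ~0.5 Gyr (~1.6e16 s), with M_QG at the reduced
# Planck mass: the delay comes out at the scale of seconds, not minutes
print(arrival_delay(1.0e3, 1.6e16))
```

The linear-in-E behavior is exactly what makes this case so tightly constrained: doubling the photon energy doubles the delay, so multi-wavelength timing of distant sources bounds xi/M_QG directly.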

But as I understand it, deformed Lorentz is not really the issue either, because Reuter spacetime is a limit of smooth 4D manifolds, each with its conventional Lorentzian metric g_k.
Each metric is the usual sort, so there are no problems with quantum field theory and so on. It is just that when you get to very, very high energy photons, you notice that photons of different energies "feel" different metrics g_k, so the distances they must travel are slightly different.
If you look at the local situation, everybody still travels at the same speed.

It will be fun for some mathematician to make Reuter spacetime physics RIGOROUS as a "projective limit" of standard spacetime physics as the cutoff k --> infty. That is what mathematicians do so well---they make rigorous, with careful definitions, what the physicists discover is happening. Things do seem to be advancing here!

Yes, because deformed special relativity has a larger symmetry group protecting the marginal operators from percolating into the infrared in an uncontrolled fashion. That's a good thing, as I've mentioned in the past.

Explicit violation, by contrast, is heavily constrained and disfavored, even though that's the type of signal MAGIC tends to favor, if you believe them.