That Photon from GRB090510

The paper in question reports the detection of a highly energetic photon of 31 GeV within the first second after the onset of the Gamma Ray Burst (GRB) 090510. As I explained in earlier posts (here, here and here), some scenarios with Lorentz Invariance Violation (LIV) motivated by quantum gravity do predict an energy dependence in the travel time of photons. Over distances as long as that from us to the origin of the GRB, tiny delays add up and can, upon arrival of the photons in our detectors, result in an energy-dependent modification of the signal. The case favored in recent years is that photons with high energies would arrive later than those with lower energies, even when emitted simultaneously.

Previously reported cases indeed indicated that highly energetic photons arrive with a measurable delay. The status of the constraints on LIV derived from these events was nicely summarized in Lee and Giovanni's recent paper. One has to keep in mind though that without precise knowledge of the emitting source it is hard to tell whether a measured effect occurred during propagation. Besides this, the LIV modification in the propagation of photons is only measurable if it is a first-order effect (of order energy over Planck mass). If it is quadratic or higher, the effect would be too small to affect the spectrum of GRBs.
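To get a feeling for the numbers, here is a hedged back-of-envelope sketch of the first-order delay, dt ~ (E/E_Planck)·(D/c). The distance is an assumed round value and cosmological redshift corrections are ignored:

```python
# Back-of-envelope estimate of a first-order LIV time delay,
# dt ~ (E / E_Planck) * (D / c). The distance D is an assumed round
# number and cosmological redshift corrections are ignored.
E_photon_GeV = 31.0           # the GRB090510 photon
E_planck_GeV = 1.22e19        # Planck energy
c = 2.998e8                   # speed of light, m/s
Gpc = 3.086e25                # one gigaparsec in meters
D = 2 * Gpc                   # assumed light-travel distance

dt = (E_photon_GeV / E_planck_GeV) * (D / c)
print(f"accumulated delay: {dt:.2f} s")  # ~0.5 s
```

The delay comes out comparable to the sub-second arrival window, which is why a single well-timed photon can push the bound on a first-order effect up to the Planck scale, while a quadratic effect would be suppressed by a further factor of E/E_Planck ~ 1e-18 and be hopelessly unobservable.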

To let you know where I stand on the matter: I don't find it plausible that quantum gravity would affect the propagation of photons from GRBs. I wrote a paper a while ago that explains very clearly why. Without going too much into the details, whether or not quantum gravitational effects are relevant depends on the energy density, not the energy. The energy density even of the GRB itself, and certainly that of the traveling photons, is many orders of magnitude too small to cause any measurable effect. Add to this that models with violations of Lorentz invariance either break Lorentz invariance, on which there are already strong constraints from many experiments, or they "deform" Lorentz invariance (known as "Deformed Special Relativity"), which causes conceptual difficulties that are so far unresolved. For these reasons, I am not very convinced that such a quantum gravitationally induced energy dependence of the speed of light would be observable in GRBs. It is however an interesting and testable scenario.

In any case, the measurements of GRB090510 reported in the above-mentioned paper clearly do not support an energy-dependent speed of light. In the paper, they derive a bound from their measurements that is some orders of magnitude higher than we expect the Planck scale to be. This means that if there was an effect, it is much smaller than a first-order modification would predict. However, it gives me the creeps when people draw conclusions from single photons. As Lee and Giovanni also pointed out explicitly in their paper, the propagation could have a stochastic component, since it's a modification caused by quantum gravitational effects of the background. In that case, only more statistics could allow conclusions. It also remains to be explained what caused the delay in the other measured gamma ray bursts.

I support your idea that gravity does not affect single photons (at least at the stage physicists are at now). The difference between the magnitudes of gravity and electromagnetism is too big, as far as I understand your posting.

Thanks. I had seen the paper, but not the discussion at PF. It does indeed bring up the essential points. Just two clarifications:

- That the violation is statistical does not refer to the chance that the photon is a background photon (which is contained in the experimental error), but to the possibility that the quantum gravitational effect has a statistical component. This means not every highly energetic photon will arrive with the same delay, and only better statistics will allow conclusions.

Well, Stefan just asked why I linked to Lubos' insults. That's simple to explain: I didn't read what Lubos wrote, since it usually isn't worth the time. I see now however that he refers to a post of mine that allegedly discusses my papers. Ironically, if you so much as read the first two paragraphs of that post you will find that it is explicitly NOT about my papers. As I have pointed out many times, here and in dozens of talks that I have given, our model does not have an energy-dependent speed of light. Which, if someone bothers to look, is also mentioned in Lee and Giovanni's paper, see page 9 top, references [44,45]. (In contrast to what they say though, I don't think such a dispersion relation is a "special" choice but the only consistent choice.)

That isn't exactly what I meant, though it goes in the right direction. What I am saying is that for the effect on a freely propagating photon, the relevant counting of orders of magnitude should not be the ratio of the energy of the propagating photon, E, to the Planck scale m_p, which would enter as E/m_p, but the closeness of the curvature caused by the photon to the Planck scale. You find such an estimate in the above-mentioned paper. Needless to say, it's a totally negligible effect, even if you add it up over some Gpc. Best,
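For orientation, here is a rough sketch of this kind of counting under a crude idealization of my own (not the exact estimate from the paper): take the photon's energy as localized within roughly its reduced wavelength cubed and compare the resulting energy density to the Planck density.

```python
# Crude idealization: energy density of a 31 GeV photon, localized
# within ~(reduced wavelength)^3, compared to the Planck density.
hbar_c = 197.327e6 * 1.602e-19 * 1e-15   # hbar*c in J*m (197.327 MeV*fm)
E = 31e9 * 1.602e-19                      # 31 GeV in Joules
lam = hbar_c / E                          # reduced wavelength, m
rho_photon = E / lam**3                   # J/m^3

l_planck = 1.616e-35                      # Planck length, m
E_planck = 1.956e9                        # Planck energy, J
rho_planck = E_planck / l_planck**3       # J/m^3

print(f"photon-to-Planck density ratio: {rho_photon / rho_planck:.1e}")
```

The ratio comes out around seventy orders of magnitude below one, which is why the effect is totally negligible even summed over Gpc distances.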

Thanks for the continued updates with regard to discovering whether or not there is a relation between a photon's velocity and its energy. I can also understand how you feel when they place so much emphasis on the recording of a single photon's reception.

Of course I’ve always wondered why physicists in general are so intent on seeing either SR or GR fail before they have something more encompassing where such results would be in precise agreement.

It just seems to me more are interested in what might be wrong, rather than concerned with what can better replace what they hope to discard. Then again, this is perhaps just a knee-jerk reaction from a principles guy, who finds phenomena on their own unrevealing without the offering of a reasonable explanation.

Perhaps one would be better advised to do more data collection and less hypothesizing; if for nothing else, it could save a little embarrassment down the road. Or have some forgotten the fate of those aether people :-)

On second thought, perhaps we shouldn't discourage all this frenzy to publish things that might prove wrong for not being better supported. In a way it could act as a sort of sum-over-histories approach for researchers, where those who are shown to be wrong are eliminated from further realization. It would certainly serve to free up a few of the scarce positions. Also, it might be a paradigm shift, to change 'publish or perish' to 'when you publish you could perish' :-)

I've changed my mind again, for my suggestion doesn't take into account theories that are not threatened by results, if they be the sort that don't make any testable predictions. I'm just wondering then what would happen if Lorentz invariance, after more thorough data collection, is shown to be violated; can these theories then be eliminated, since they failed to predict a violation? I guess you can be damned if you do or damned if you don't.

I also noticed that some who support string theory are calling this latest result a victory, as if a null result in relation to an existing theory could be considered as such. I've long wished that philosophy and science could be reunited, yet never at the expense of what either brings losing its usefulness.

I’ve always wondered why physicists in general are so intent on seeing either SR or GR fail

I wouldn't say that physicists in general are intent on seeing either SR or GR fail. For whatever reason however this seems to be a topic which is written about very frequently in the popular literature, this might give you a wrong impression. (It is also the topic that seems to attract most hobby physicists. Next to the idea that "quantum mechanics is wrong.")

In addition you have to understand the context. It is believed by many physicists that GR must fail in extreme regimes in order to make it compatible with quantum mechanics. Any such deviation would thus be a hint towards the long-searched-for quantization of gravity. SR however already "fails" within the solar system. The world simply isn't symmetric under global Lorentz transformations, and we also know that our universe has a preferred frame, imprinted in the CMB.

In any case, the reason for many speculations is a pressure to come up with predictions that has risen noticeably within the last decade. While I agree that a model should make predictions and as a phenomenologist I certainly welcome the trend towards phenomenology, desperately trying to cook up models that make predictions no matter how implausible isn't helpful either.

It is for this reason that I disagreed with Christine in this earlier post. One shouldn't overemphasize testable predictions if it comes at the expense of consistency.

With regard to your last comment, please keep in mind that the modification that gives rise to an energy dependent speed of light has not been derived from LQG. Best,

I discussed this event on my blog (which I'm not allowed to link here, indeed) a few months ago, and it forms a nice example of a case where deeper insight is required, if LQG fails in a way which appears to support string theory.

LQG could still explain this observation if it could handle interactions of many photons at the same moment, but the AWT model is still more intuitive here: when we shake the vacuum foam, it becomes denser, so that a pulse wave propagates through it like a wave packet, i.e. like a soliton or dense area of vacuum, which focuses other photons into it.

The whole trick is that the farther a gamma ray flash is, the more intense it should be, so that the most distant flashes propagate through space like photonballs (in analogy to glueballs, known from nuclear physics), i.e. like a giant soliton, which is kept together like a vortex ring without dispersion.

In accordance with this explanation, the dispersion of closer gamma ray flashes is much more pronounced than in the case of the remote ones. So it may even be possible that the lone photon observed was trapped in the gamma ray flash during its travel through vast cosmic space.

So this phenomenon is not exclusive to non-stringy theories. In fact, it may be quite general.

It is interesting that Mavromatos fits a Gaussian to the arrival times of photons.

Similarly, at least I think I can clearly visually see a peak in the delay of the arrival time of GRB090510. I guess a good exercise would be to correlate different peaks in a graph with energy × distance, and see if one can accumulate enough points from a lot of different GRBs to tell whether there is really a stochastic delay.

In general relativity it's believed that a high energy photon doesn't gain mass with frequency, just momentum.

This approach may be consistent with the formal understanding of relativity and the refusal of the Aether concept, but as we can see, in this particular case it is completely misleading. We can imagine photonball formation like a cluster of photons which are tied together by their own gravity field, which is indeed impossible if photons had energy and momentum only.

The dogmatism of general relativity proponents is caught in its own trap here, because it prohibits them from explaining the MAGIC observation in an intuitive way while they're facing the critique of string theory proponents. In AWT the dynamic mass of a photon is a real mass with all its consequences, including the condensation of high energy photons into photonballs, so it can explain the same phenomena in many consistent ways.

/*..In fact, it maybe quite general...*/ String theory uses Lorentz invariance as one of its fundamental postulates, being dependent on quantum mechanics and special relativity. No rigorous theory can derive a violation of its own postulates in a rigorous way.

String theory isn't a rigorous theory in any way, so it may derive a violation of Lorentz symmetry by quietly omitting this postulate from the derivation; nevertheless we should understand Motl's post in a deeper context. Every violation of Lorentz symmetry effectively renders string theory invalid.

But in this particular case we aren't disputing the symmetry violation of a single photon, but of a whole cluster of photons, which is indeed quite a different situation: at the scope of such a cluster, individual photons may move randomly along different paths while still keeping the shape of the cluster as a whole, so that Lorentz symmetry remains maintained at the cluster level.

If I imagine it well, then LQG can be reconciled with string theory in such a way that we imagine a segregation of photon speeds in the cluster: heavy gamma ray photons are sitting at the center, while faster X-ray photons revolve around this center along a longer path, like particles in a vortex ring. This could have a testable impact on the distribution of energies along the time axis of a gamma ray flash: heavy photons should preferentially appear in the middle of the Gaussian curve.

Yes, Mavromatos also gave a talk at our 2007 conference, recording here, and we also discussed that at the previous post. His model however is no more derived from string theory than DSR is from LQG, thus no conclusion can be drawn in either case. Best,

I was just about to reply to your comment, but it seems Stefan disliked it. I have told you many times, on this blog, on your blog, and by email that the model I've been working with does not have an energy dependent speed of light. You can very easily verify this by looking at my papers, it is also explicitly mentioned in Lee and Giovanni's paper, and I wrote several blogposts on it.

You fail to understand this fact even when communicated to you in plain English, you continue to assign opinions to me that I don't hold and have never held, and you continue to claim my papers contain statements they don't. The only reason I can see for this absurd behavior of yours is that you are desperately trying to make me look stupid. The only result however is that it makes you look stupid.

The Large Area Telescope (LAT) detects gamma rays by using Einstein's famous equation (E=mc²) in a technique known as pair production. When a gamma ray, which is pure energy, slams into a layer of tungsten in one of the Tracker's towers, it can create a pair of subatomic particles (an electron and its antimatter counterpart, a positron). The direction of the incoming gamma ray is determined by projecting the direction of these particles back to their source using several layers of high-precision silicon tracking detectors. A separate detector, called a calorimeter, absorbs and measures the energy of the particles. Working one gamma ray at a time, the LAT will make gamma-ray images of astronomical objects, while also determining the energy of each detected gamma ray.
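The "pure energy into particle pair" step has a simple threshold one can check: the gamma ray must at least supply the rest energy of the electron-positron pair. A quick sketch (the nucleus that absorbs recoil momentum is left implicit):

```python
# Pair-production threshold: E_gamma >= 2 * m_e * c^2.
# (A nucleus must be present to absorb recoil momentum; its tiny
# recoil energy is neglected here.)
m_e_c2_MeV = 0.511                  # electron rest energy
threshold_MeV = 2 * m_e_c2_MeV      # 1.022 MeV
E_photon_MeV = 31e3                 # the 31 GeV photon from GRB090510

print(threshold_MeV)                 # 1.022
print(E_photon_MeV / threshold_MeV)  # ~3e4: far above threshold
```

So the 31 GeV photon exceeds the threshold by roughly four orders of magnitude, which is why it converts readily in the tungsten layers.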

Sorry, you are right, I meant this post. I didn't mean you said consistency isn't relevant. I just meant to say it can be a side-effect of putting too much emphasis on pheno. In any case, I think we discussed that on your blog or somewhere, and I think we found some common ground. It just crossed my mind when I wrote the above that we had talked about this recently. Best,

The gamma ray spectrum being mapped sort of sets up the idea of a very dynamical relation to how one sees that new window/view on the universe. Your imaging is different, no?

The calorimeter design for GLAST produces flashes of light that are used to determine how much energy is in each gamma-ray. A calorimeter ("calorie-meter") is a device that measures the energy (heat: calor) of a particle when it is totally absorbed. CsI(Tl) bars, arranged in a segmented manner, give both longitudinal and transverse information about the energy deposition pattern. Once a gamma ray penetrates through the anticoincidence shield, the silicon-strip tracker and lead converter planes, it then passes into the cesium-iodide calorimeters. This causes a scintillation reaction in the cesium-iodide, and the resultant light flash is photoelectrically converted to a voltage. This voltage is then digitized, recorded and relayed to earth by the spacecraft's onboard computer and telemetry antenna. Cesium-iodide blocks are arranged in two perpendicular directions, to provide additional positional information about the shower.
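As a toy illustration of how segmented CsI bars yield both a total energy and positional information, here is a sketch with invented deposit values (not actual GLAST numbers): sum the per-bar energies and take an energy-weighted centroid along one bar direction.

```python
def reconstruct(deposits):
    """deposits: list of (bar_position_cm, energy_MeV) pairs from one layer."""
    total = sum(e for _, e in deposits)                 # total deposited energy
    centroid = sum(x * e for x, e in deposits) / total  # energy-weighted position
    return total, centroid

# A fake electromagnetic shower, concentrated in the central bars.
shower = [(-3.0, 50.0), (-1.0, 400.0), (1.0, 420.0), (3.0, 60.0)]
energy, x0 = reconstruct(shower)
print(energy, round(x0, 3))  # 930.0 0.054
```

With two perpendicular layers, the same centroid computed in each direction gives the transverse shower position, which is the "additional positional information" mentioned above.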

Dorigo mentions an idea with regard to the future in relation to high energy calorimetry measures, and at the LHC this is a newer model for apprehension, while the future points us in the "right direction." Not just swift guidance turning, but directly? Specifically, the right direction.

You rely on colorimetric:) evidence for this understanding.

It is a "short conceptual change" to imagine the Lagrangian perspective, to realize the gravitational inclination that now exists for how one views that same evidence? You see?

Yes, for sure there is an assumption here, and some will know what that is, while others are content to describe just the state of the photon.

So indeed we see where perspective is being driven to, while a calibration is needed for a future consistent measure of this differentiation between colorimetric versus calorimetric:)

Yes, nice gamma ray sphere of the universe or earth.

Parentani showed that the effects of the fluctuations of the metric (due to the in-going flux of energy at the horizon) on the out-going radiation led to a description of Hawking radiation similar to that obtained with analogue models. It would be interesting to develop the equivalent formalism for quantum analogue models and to investigate the different emerging approximate regimes.

So one sees differently now, assuming "analogue models" for perceptions about our existing universe. Is there a "calibration point" in regard to a black hole measure?

My rough understanding of some motivation for adjusting simple SRT, as rough similarity to the issues of this post: if there is some "problem" with a photon having the Planck energy or higher, that clearly is relative to reference frame in SSRT (simple SRT.) Hence there must be some way for photons to effectively never be "seen" with that energy. The Planck energy includes "G", hence a connection to quantum gravity.

Also, if space has some "grain" due to compact dimensions, perhaps such extreme energy photons would diffract off the "structure" provided by those curled up dimensions. IOW, empty space would tend to diffuse such radiation as if it were milky.

Oh, I realized that for compact dimensions in empty space to diffract and diffuse super HE photons has the same problem, of photon energy being relative to a reference frame (or you would need some absolute standard of rest for the universe, or for velocity to have some absolute character.) I think that any adulteration of SRT would goof up things like energy conservation, if e.g. photons were used to pass energy from one frame to another etc.

I stop at three in a row. Here's a question I left at Lumo's joint, after similar comments as mine above (boy is he crabby about a lot of things):

The issue about "diffraction" does bring up a good question, so how do we make compact dimensions Lorentz invariant? They clearly can't be like a literal "structure" that an ordinary material would have, because that would institute a standard of "rest" in the universe. tx

Neil, I don't know what the issue with compact dimensions is supposed to be. As to your first comment, that is indeed the motivation usually given, but I'd want you to reflect on the fact that you cannot "see" anything without interacting with it in some way. Thus, talking about some problem with a freely propagating photon isn't meaningful, because for there to be any observable problem you'd have to interact with it. I wrote a paper on the difference between these two points of view, see here, which explains the difference between the line of thought leading to my model and that leading to the usual DSR. I did my best to make it accessible to the reader, so give it a try.

Thanks Bee. OK, point taken re need to observe, but I am thinking in the terms of "what if someone was there to see it ...", a common counterfactual conditional stance. That CC stance is sometimes criticized, I know.

Compact dimensions: in some pop sci treatments, we see a bunch of rolled up circles etc. That leads to an intuitive notion of a grain to the space, that it might be like bubbles etc. But I can think of it as like a sheet with actual thickness (in different directions, not just one more dimension of thickness) and the issue of actual structures wouldn't occur.

I think I was actually remembering descriptions of "quantum foam" in space-time itself, not the extra dimensions. In the QF idea, space is warped into little pockets and bubbles (again, per pop sci tropes, which are often misleading.)

It seems to me that could be like a grain that would have an actual structure as a function of position, however small and however temporary. If so, the configuration could be defined relative to velocity, just like a "real structure." Maybe it could diffract very short wave length light etc. But that has issues of Lorentz invariance, which takes us back to this whole argument. However, I suppose I am imagining it too literally and simplistically.

Well, I recommend you try not to think of it as having an actual thickness. It's pointless because the thickness itself would also be described as another dimension. You break Poincare invariance if you embed structures into the higher dimensional space, which is quite common in many brane-world scenarios.

/*..break Poincare invariance if you embed structures into the higher dimensional space... */ This is just why theories which rely on extra dimensions together with Poincare and Lorentz invariance (the Lorentz group is a subgroup of the Poincare one) are simply inconsistent BS. The existence of extra dimensions would manifest itself just by a violation of Lorentz (and therefore Poincare) invariance.

Indeed LQG goes just one step further, because the assumption of a spin foam violates the equivalence principle. So while LQG can explain a violation of Lorentz invariance more or less successfully, it cannot explain Lorentz symmetry violation at both large distance and energy density scales. Which is interpreted by people like Motl as a point for string theory.

Note that we are talking about things which can be modelled by the dispersion of waves on a water surface all the time. This is not rocket science.

Zephir: There is absolutely nothing inconsistent about the fact that adding a structure to an otherwise flat and empty space breaks Poincaré invariance. If you so much as put a grain of sand into Minkowski space you also break Poincaré invariance. In addition, extra dimensions per se do not necessitate such a procedure, and some models with extra dimensions don't need such extra structure (the old Kaluza-Klein theories, e.g.).

/*..adding a structure to an otherwise flat and empty space breaks Poincaré invariance..*/

If we consider the CMB field, the vacuum can never form a flat space, so its Poincaré/Lorentz invariance would always be violated. After all, this curvature makes the observable Universe limited and closed.

It means string theory cannot explain the CMB field, although it uses it in many extrapolations (L. Motl: "..newest Fermi result means that the symmetry properties resulting from a smooth, continuous, Lorentz-invariant spacetime..."). Under Lorentz invariance, photons would have zero rest mass and could never interact, couple and materialize mutually, thus leading to a fully transparent Universe.

The residual controversy consists of the point whether to consider the CMB as a part of space-time or not (it can be shielded by conductors, by superconductors in particular).

/*..obviously wrong that particles with zero rest mass cannot interact...*/ Such a particle would be of zero surface curvature, so it couldn't exist at all (it would become larger than the observable Universe, which is of limited diameter). In fact, with respect to the small positive curvature of the CMB field, such a particle would effectively be a tachyon and would have escaped from our Universe already instead.

Local Lorentz invariance has no meaning in a discussion about gamma ray dispersion at large distances - or does it?

Zephir: What you say does not make any sense. For one, because particles don't have a surface curvature; second, because curvature is determined by the stress-energy tensor and not by the rest mass; third, if you open a random textbook on GR or electrodynamics you'll find frequently used examples of theories with interacting particles of zero rest mass that won some Nobel prizes.

/*..curvature is determined by the stress-energy tensor and not by the rest-mass...*/ The largest photon at rest which fits inside the observable Universe still has a diameter/wavelength of ~40 billion light years, so its energy is not zero, so its Tuv tensor is not zero..

Well, the fact that LQG cannot explain the GZK limit violation lies just in the fact that it considers a zero rest mass of the photon. String theory considers a zero dynamic mass of the photon, so it suffers larger problems, both at short scales and at the cosmological scale. The problem of these theories is in minute details, which cannot be neglected when extreme energy density propagates over extreme distances.

The point of string theory is that it considers a zero mass of the photon, which is true just for the CMB field. In the CMB field of tiny positive curvature, even the curvature of a microwave photon with finite mass is effectively zero (such a photon would vanish in the CMB field fast, so that every light wave of such frequency propagates here like a "pure" harmonic wave without photons, thus maintaining Lorentz symmetry).

So L. Motl can pretend he's using Lorentz invariance correctly just at the CMB scale.

Zephir: Indeed, the photon doesn't have zero T_munu, which is exactly what I said. As to best present knowledge, the GZK cutoff is exactly where it's supposed to be, so there is nothing to explain. Regarding Lorentz invariance, what I said above is that you are confusing local with global Lorentz invariance. String theory does certainly not predict global Lorentz invariance.

/*..the photon doesn't have zero T_munu, is exactly what I said..*/ Let it be so... And my point was that a photon with a wavelength of 40 billion light years cannot move in the Universe, so it's effectively in its rest state. And while such a photon still has nonzero T_munu, its rest mass cannot be zero either.

If your theory isn't working at the level of the most trivial logic, it cannot work on any more advanced level, no matter how many tensors, twistors and spinors you use. If the epicycles theory doesn't handle the Venus phases well, it's irrelevant that it can compute eclipses and conjunctions; something is still wrong with it.

Zephir: a) you can have static electromagnetic fields despite the fact that the photon is massless b) the size of the observable universe is not commonly considered to be the total size of the universe c) even if it was, there is nothing inconsistent about having an IR cutoff d) you seem to have accumulated a lot of misconceptions. I would really recommend you carefully read some standard textbooks on electrodynamics and GR before you further waste your and other people's time. Best,

/*..I would really recommend you do carefully read..*/ OK, but why should I do it, if it didn't help you to explain to me the problem of the photon rest mass? Anyway, it's not a problem to propose a logically consistent theory in which the mass of the photon is not a problem, but a feature. Bye, Z.

Zephir: It might help you to understand what I have explained, namely that there is no problem with a zero rest-mass particle (photon or otherwise) interacting. The mass of the particle is not decisive for the interaction term; in fact, the SM would be much easier if all particles were massless. It is correct that if our space-time were of finite extension (I doubt you'd find many physicists who find it plausible that the universe ends at the Hubble horizon) then there could be no particles with wavelengths larger than that. However, that the axioms of electrodynamics do not contain such knowledge about a finite maximum wavelength, should it exist, is not a logical inconsistency.

/* ..there is no problem with a zero rest-mass particle (photon or otherwise) interacting...*/ Indeed, if we postulate that there's no problem with it, then there's no problem at a certain level of understanding. We can even build a consistent formalism based on it, consequently separating the mass term into an energy term and a momentum term. For every theorist, the theory of his personal preference appears internally fully consistent, because it was built so.

But as I explained above [5:28 PM, August 16, 2009], under certain situations such an approach poses a problem, especially when considering another theory which was developed from a different (thus inconsistent) postulate set. If these postulates were consistent, we could replace them by the original postulate set, isn't it true? Therefore we will always face inconsistencies when mixing different theories, and we can't do anything against it. Every attempt at reconciliation of different theories without the postulation of a more general theory, which handles the existing ones as special cases, is futile in advance.

The problem of string theorists and LQG theorists is exactly the same in this context: they proposed a combination of existing theories, but no general theory which would explain the postulates of the existing theories. Instead, they proposed low dimensional constraints (loops, strings), which attempted to minimize the mutual inconsistency of existing theories, i.e. a regression of existing regressions.

/*..a theory is either consistent or it isn't...*/ Well, exactly. And for its author it is usually consistent. For authors of other theories it is usually inconsistent. For example, an ordinary relativist falling into a black hole can believe in a constant speed of light even in the case when he is orbiting the black hole in place together with other photons. From his local perspective nothing changed. But this scale isn't larger than a few meters by now.

Indeed, a remote observer from outside can clearly see that the motion of light has stopped here. Who is right now?

As a less dramatic example can serve gravitational lensing. During lensing we can clearly see that light spreads at a different speed when moving around a massive object, so that Lorentz symmetry is apparently violated here. A relativist can say: "nope, it's the space-time, not the path of light, which is curved here". But how can he prove it? He didn't put a clock into the gravitational lens; his interpretation is causally separated from reality and he has no factual evidence for his stance. He just believes in it.

/*..there is no reason why a different set of postulates necessarily must be inconsistent...*/ Indeed there is: if there weren't, we could substitute and replace it by a single set of postulates, so no difference would exist here anymore. This is basically what Goedel's incompleteness theorem is about...

..OK, I'll stop with further reasoning, but your strictly formal approach is clearly inconsistent in many areas. You're living in a world of atemporal math, where all theorems are valid at the same moment. But in the real world there's no such thing as a "single moment" for all stuff. Light can travel through the causal foam (spin network) along many directions at the same moment and there's no single time arrow. After all, the photons revolving around the soliton of a gamma ray demonstrate it clearly.

I wonder how particles of zero rest mass could interact in a conventional way, because any field from the luxon would not have a specifiable relativistic transformation (the way one from e.g. an electron would.) There are also the issues of the transformation of lateral force, proper time for reaction, etc.

But if not thinking in classic relativistic terms, we know e.g. that light can react with light (true) at very high energy densities.

Zephir, there is no problem with a particle just having "zero rest mass" per se. The particle does have energy, just not a defined rest mass due to it being a luxon (goes at the speed of light.) Please re-read some basics and clarify some things for yourself.

Like you, Bee, I was very astonished to read that most of the conclusions on the constraints on LIV were based on the detection of a single photon. I believed we were in the realm of highly sophisticated statistical results on thousands of measurements, at least in the domain of particle physics, and I see that in astronomy they use a single measurement to draw conclusions. Isn't something wrong there, or am I too stupid to understand?

/*..something wrong there or am I too stupid to understand...*/It's indeed silly to extrapolate a relevant conclusion from a single-photon observation - but GZK limit violation and lack of dispersion are rather common for energetic gamma ray flashes. In addition, close gamma ray flashes are more dispersive, which is counterintuitive to a certain extent (they're weaker in general and they should be shorter in general).

Zephir, I am not sure what you're getting at. As I said, luxons have energy but no "rest mass" because they can't be at rest for any observer. This does not imply anything about *vacuum* dispersion, the point of the photon from GRB.... Some in the past claimed some delays in arrival time which they thought could be a case of dispersion, now that's less credible due to complications and the new observation.

Don't confuse GZK, an effect of the cosmic background radiation on charged particles (only?), with intrinsic vacuum dispersion of very energetic photons. Most or all "cosmic rays" are "particles" and not gamma photons, sayeth Wikipedia:

Cosmic rays are energetic particles originating from outer space that impinge on Earth's atmosphere. Almost 90% of all the incoming cosmic ray particles are protons, almost 10% are helium nuclei (alpha particles), and slightly under 1% are heavier elements and electrons (beta minus particles).[1] The term ray is a misnomer, as cosmic particles arrive individually, not in the form of a ray or beam of particles.

2. Statistics needed in science? Not necessarily. A single good observation proves a point if the outcome is different from what a theory requires. Vacuum dispersion would not be statistical; it is a function of the energy? - but wait, IIRC Bee says that the density matters and not just the energy of one photon by itself.

Well, that changes everything, but wouldn't the experimenters reporting this and the supposed theoretical implications (that DSR is wrong etc.) be aware of that? Someone please clarify.

I don't think there's anything wrong with the analysis itself, one just shouldn't draw wrong conclusions from it. If you look at table 2, you will find that the limits depend - as one can expect - sensitively on the time at which the photon was assumed to be emitted. The only thing one can be confident about, however, is that the photon wasn't emitted before the burst started, which leads to a limit of the order of the Planck mass. The others associate the photon with emission at later peaks, leading to tighter bounds. The fact is, however, that they don't know when the photon was emitted. Thus, I think the only way one can arrive at a meaningful bound is to sample more data and get a better understanding of the regularities of the source (e.g. the peak structure, duration, etc). Until then, associating high energy photons with one or the other peak is just guesswork. Best,
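The scaling behind these limits is simple enough to sketch. In the minimal linear-LIV picture the delay is roughly Δt ≈ (E/M_QG) · T, with T the light travel time, so the bound on M_QG is inversely proportional to the delay one is willing to attribute to propagation. A rough Python sketch, assuming a linear model and ignoring the cosmological redshift integral a real analysis would include; the travel time and delay values are purely illustrative, not the Fermi numbers:

```python
# Rough scaling of linear-LIV bounds: delta_t ~ (E / M_QG) * T.
# Energies in GeV; all numbers are illustrative assumptions.

M_PLANCK_GEV = 1.22e19  # Planck mass in GeV
E_PHOTON_GEV = 31.0     # energy of the GRB 090510 photon

def liv_delay(E_gev, travel_time_s, M_qg_gev):
    """First-order LIV time delay for a photon of energy E (GeV)."""
    return (E_gev / M_qg_gev) * travel_time_s

def m_qg_bound(E_gev, travel_time_s, max_delay_s):
    """Lower bound on the QG scale from a maximum allowed delay."""
    return E_gev * travel_time_s / max_delay_s

# Illustrative light travel time for z ~ 0.9 (~7 Gyr in seconds),
# neglecting the redshift-dependent factors of a proper analysis:
T = 7.3e9 * 3.156e7

delay_at_planck = liv_delay(E_PHOTON_GEV, T, M_PLANCK_GEV)
print(f"delay if M_QG = M_Pl: {delay_at_planck:.2f} s")  # sub-second

# The bound tightens as the assumed emission time moves later,
# i.e. as the allowed delay shrinks (cf. the table-2 behavior):
for max_delay in (0.9, 0.3, 0.03):
    bound = m_qg_bound(E_PHOTON_GEV, T, max_delay)
    print(f"delay < {max_delay} s  ->  M_QG > {bound / M_PLANCK_GEV:.2f} M_Pl")
```

This makes the point of the comment above concrete: associating the photon with a later peak shrinks the allowed delay by an order of magnitude and inflates the quoted bound by the same factor, without any new physics input.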

You are obviously right, and the way they have written up this result borders on the scandalous. The sad thing is that they knew that they could get away with this simply because they were telling people what they wanted to hear.

I once heard a senior professor explaining to a junior how to get papers accepted. "Bring only good news" was the sage advice. Tell them what they want to hear.

We can easily say that photon condensation is evidence of a nonzero photon rest mass and of the AWT model - but this is not exactly what the rest of the scientists want to hear, so the situation is rather symmetrical with respect to both AWT and string theory. Both theories are happy in their own way, while LQG has nothing to say about it.

Hi, I wonder if I could clarify some issues raised here. More details and references are in a recent paper by Amelino-Camelia and myself, arXiv:0906.3731.

First, LQG in 3+1 dimensions has not been shown to break or deform Lorentz invariance. There were some papers, starting in the 90s, studying excitations of non-physical ansätze for vacuum states (i.e. that didn't satisfy the quantum constraints) that showed evidence for Lorentz symmetry breaking. These were not predictions of LQG; they were consequences of an ansatz for the ground state that broke both Lorentz invariance and diffeomorphism invariance. They could be characterized as exploratory, but very far from definitive. There is no definitive result concerning the symmetry of the ground state in LQG in 3+1 dimensions.

So, unfortunately, it is not correct to claim that this or any such result rules out LQG. I say unfortunately because it would be nice if we knew what the prediction was of LQG for deformed dispersion relations, but despite some effort we don’t.

I have published two papers arguing that a form of DSR is a consequence of generic quantum theories of gravity in the semiclassical approximation, plus certain scaling assumptions: hep-th/0501091v2, arXiv:0808.3765v1. These derivations make several assumptions, particularly as to the scaling dimensions of certain operators, which have not been confirmed in LQG or any theory. If there is no linear dispersion (order l_Planck) then we learn that one of the assumptions of these arguments is wrong, and my guess is it would be these scaling assumptions.

One can also deduce the significance of these scaling relations from some general considerations that derive kappa-Poincare symmetry from quantum-deformed (A)dS symmetry; this was shown in hep-th/0306134 and hep-th/0307085.

In fact, in 2+1 dimensions the argument from quantum group theory is correct and the low energy symmetry is kappa-Poincare (hep-th/0512113, hep-th/0502106). This suggests it's not crazy to hypothesize that the same is true in 3+1, but this is not a proof; it is a suggestion of a line of argument.

Whether string theory allows deformed Poincare symmetry is unknown. In hep-th/0401087, Magueijo and I showed that there are consistent free bosonic string theories with deformed energy-momentum relations; to my knowledge no one has followed up to investigate what happens to this when interactions are included. Otherwise string theory assumes perfect Lorentz invariance.

Further, Lorentz symmetry breaking at order l_Pl is already ruled out by several orders of magnitude by observations of polarized radio galaxies, which constrain the birefringence from the parity-odd term in the effective action for Maxwell fields that appears at dimension five. What might be the case, but is now somewhat constrained, is a parity-even deformation of Poincare invariance. It also should be emphasized that order l_Pl^2 effects are not strongly constrained by any observation, so there could still be Planck scale Lorentz symmetry breaking at that order.

Coming to the recent observations: that Fermi was capable of putting order l_Pl limits on dispersion has been clear for a while and was discussed in detail by Amelino-Camelia and myself in arXiv:0906.3731. My reading of the recent Fermi collaboration paper on GRB090510 is that the conservative bound of about 1.2 M_Pl is reliable, while the stricter limits are based on assumptions about the sources which are at this time speculative. I might make a couple of other comments on the results in this important paper.

- Even the most conservative bound > 1.2 M_{Pl} conflicts with the claims of Ellis et al in the MAGIC and subsequent papers to have measured an effect of around 0.1 M_{Pl}.

- They also give a very interesting bound for the advanced case, s=-1, which is also around 1.2 M_{Pl}. This is much better than the best bound so far which, as far as I know, is the one in Giovanni's and my paper: 3.2 × 10^17 GeV.

Lee, thanks. Questions: Do you think it risky to draw conclusions from a single photon, albeit a good "catch" AFAICT?

As for your referencing various dimensional spaces, how unique is the proper "extrapolation" of our physical principles to D = (N + 1) spaces, and are there any intrinsic "contradictions" in any (not to be confused with "inconvenient" features like unstable atoms or solar systems)? A brief note is OK, tx

Sure, one photon might be an error; eyeballing the data it looks like there are several other photons from which they might get a conservative bound a factor 3 to 5 less. And this is just from one GRB. As we discussed in the paper with Giovanni, with a GRB with photons in the GeV range coming roughly once every 6 weeks, it is only a matter of time before another chance to set a conservative bound of order M_{Pl} arises.

Gravity in 2+1 dimensions is very special and different from 3+1, in several ways. Until proven otherwise, any feature proven there is likely an artifact of the special properties of 2+1 dimensions. Nonetheless there are some interesting things we have learned that are being applied to 3+1, but it is too soon to tell if these work, and it would require a technical discussion to explain them.

In my opinion this lone photon was trapped into the photonball from outside or, even worse, it could serve as its condensation nucleus, in a similar way as a particle of dust enables molecules of water to condense into a droplet. It would mean that the occurrence of photons of unexpectedly high energy density inside gamma ray flashes isn't accidental here at all.

A comment I have deleted for reasons of personal hygiene points out an older remark by Lee, from October 2003 in the Edge:

... When we first worked out the predictions for these smallest units of area and volume, we had no idea that they would be observable in real experiments in our lifetime. However, a number of people—beginning with Rodolfo Gambini, of the University of the Republic in Montevideo, and Jorge Pullin, then at Penn State—showed that there are indeed observable consequences. At about the same time, Amelino-Camelia and others were pointing out that if there were such effects, they would be detectable in experiments involving cosmic rays and gamma-ray bursts. These effects are caused by light scattering off the discrete structure of the quantum geometry, analogous to diffraction and refraction from light scattering off the molecules of the air or liquid it passes through. The quantum gravity effect is tiny—many orders of magnitude smaller than that due to matter. However, we observe light from gamma-ray bursts—huge explosions, possibly caused by mergers of binary neutron stars or black holes—that has traveled across the universe for some 10 billion light-years. Over such long distances, the small effects amplify to the point where they can be observed. Because elementary particles travel as waves in quantum theory, the same thing happens to such particles—protons and neutrinos, for example. It is possible that these effects may be responsible for the surprises I mentioned in the observations of very-high-energy cosmic rays.

Now, here is the really interesting part: Some of the effects predicted by the theory appear to be in conflict with one of the principles of Einstein's special theory of relativity, the theory that says that the speed of light is a universal constant. It's the same for all photons, and it is independent of the motion of the sender or observer.

How is this possible, if that theory is itself based on the principles of relativity? The principle of the constancy of the speed of light is part of special relativity, but we quantized Einstein's general theory of relativity. Because Einstein's special theory is only a kind of approximation to his general theory, we can implement the principles of the latter but find modifications to the former. And this is what seems to be happening!

So Gambini, Pullin, and others calculated how light travels in a quantum geometry and found that the theory predicts that the speed of light has a small dependence on energy. Photons of higher energy travel slightly slower than low-energy photons. The effect is very small, but it amplifies over time. Two photons produced by a gamma-ray burst 10 billion years ago, one redder and one bluer, should arrive on Earth at slightly different times. The time delay predicted by the theory is large enough to be detectable by a new gamma-ray observatory called GLAST (for Gamma-ray Large Area Space Telescope), which is scheduled for launch into orbit in 2006. We very much look forward to the announcement of the results, as they will be testing a prediction of a quantum theory of gravity.

As I see from the original paper of Gambini and Pullin (gr-qc/9809038), their calculations are based on a number of assumptions and on the construction of certain semiclassical weave states. Moreover, they specifically state in their paper:

"We will see however, that the nature of the effects predicted by loop quantum gravitydepend on the type of semi-classical state that one considers. In a sense, one can turn the argument around andsuggest that rather than viewing these effects as a prediction of the theory, they can be used to constrain the type ofsemi-classical states one considers to represent realistic cosmologies."

"calculated how light travels in a quantum geometry and found that the theory predicts that the speed of light has a small dependence on energy"

Rovelli on the other hand is very cautious regarding this prediction and uses a completely different language. Quote from his book on Quantum Gravity:

"Gambini and Pullin have introduced the idea to study the propagation of matter fields over a weave state, taking expectation values of smeared geometrical operators. For suitable weave states, this may lead to the possibility of having quantum gravitational effects on the dispersion relations."

/*..special theory of relativity, the theory that says that the speed of light is a universal constant. It's the same for all photons, and it is independent of the motion of the sender or observer...*/The trick is, special relativity "doesn't know" about photons at all. The photon is a concept of quantum mechanics, and special relativity has nothing to say about it. Special relativity was designed for and remains valid for the light wave, which is an abstract artifact without shape and size, filling the whole Universe. In AWT such an abstract artifact is real only for microwaves with wavelengths at the CMB scale, where every photon vanishes fast into the CMB radiation. In such an environment the speed of a photon is effectively equal to the speed of the light wave, so we can use special relativity, because inside such an environment no light photons can be distinguished from the CMB background, i.e. they cannot "exist".

Another source of confusion consists in the point that LQG is based on general relativity, and general relativity is a different theory from special relativity in that it doesn't use light speed invariance in its postulates. If two theories use different postulate sets, they're expected to mutually disagree in a more or less distant perspective. How is that possible?

It's generally known that special relativity is valid only for inertial reference frames. A gravity field isn't such a reference frame, and special relativity can be used here only if we maintain the condition that the object is moving along gradients of the gravity field (like the plane in the Hafele-Keating experiment). The gamma ray photons, which are repeatedly crossing tiny density fluctuations of the gravity field caused by gravitational waves of the CMB, are definitely not such a case. So we shouldn't adhere to nonlocal Lorentz invariance here at all, because photons are moving along the straightest path in four dimensions, not along the straightest path in deformed 3D space.

"These effects are caused by light scattering off the discrete structure of the quantum geometry, analogous to diffraction and refraction from light scattering off the molecules of the air or liquid it passes through. The quantum gravity effect is tiny—many orders of magnitude smaller than that due to matter." That's the corrected point I made here and at The Reference Frame, following up on compact dimensions (which is a different issue than quantum foam in 3+1, but perhaps with relations). I said that high-energy photons might diffract off inherent micro-features of space itself. The photon energy would be around the Planck energy, as discussed here.

This scattering effect is not the same issue as vacuum dispersion, since the latter is directly defined in terms of velocity and not scattering. However, the slowing effects could be related.

Note that since photon energy is frame-dependent, we need some adjustment to SRT for this to make sense. Space is not supposed to be like a diffraction grating sitting there with a given rest frame. Otherwise, my velocity may be parallel and positive to the HE photon, and the Doppler shift makes the photon less energetic; then I couldn't account for the scattering/dispersion. In DSR schemes, moving fast adjusts your standard of c a little to make it self-consistent, per all motion being relative per se.

Zephir, GR does postulate constant c, it is just c as defined in the immediate small vicinity of an observer. A very tiny light-reflection clock is still accurate, since the distance scale can be defined locally and accurately and we use local time.

Note this produces odd large-scale effects: Radar reflected off Mercury takes longer to get back, because of both curved space "dimpling" AND that time is slower near the Sun. So when we plot the progress of the radar photons by our standards, we imagine them having a longer path over the dimple, and also going slower per red shift of time when they are near the Sun.
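The extra radar travel time mentioned here is the classic Shapiro delay. In the standard weak-field formula, the excess round-trip delay for a ray grazing the Sun is roughly Δt = (4GM/c³)[ln(4 r_E r_M / b²) + 1]. A minimal sketch with approximate textbook values (the constants and orbital radii here are rounded assumptions, not a precise ephemeris):

```python
import math

# Excess round-trip radar delay (Shapiro delay) for a signal grazing the
# Sun on its way to Mercury and back, in the standard weak-field formula:
#   dt = (4 G M / c^3) * [ln(4 * r_E * r_M / b^2) + 1]
# All values are rounded textbook numbers.

GM_SUN = 1.327e20          # gravitational parameter of the Sun, m^3/s^2
C = 2.998e8                # speed of light, m/s
R_EARTH_ORBIT = 1.496e11   # Earth orbital radius, m
R_MERCURY_ORBIT = 5.79e10  # Mercury orbital radius, m
R_SUN = 6.96e8             # solar radius = impact parameter of a grazing ray, m

def shapiro_round_trip(r_e, r_m, b):
    """Round-trip Shapiro delay in seconds for a grazing radar signal."""
    return (4 * GM_SUN / C**3) * (math.log(4 * r_e * r_m / b**2) + 1)

dt = shapiro_round_trip(R_EARTH_ORBIT, R_MERCURY_ORBIT, R_SUN)
print(f"extra round-trip delay: {dt * 1e6:.0f} microseconds")
```

The result lands in the couple-hundred-microsecond range, which is the order of magnitude Shapiro's radar experiments measured; it combines exactly the two contributions described above, the longer path over the "dimple" and the slower coordinate speed of light near the Sun.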

/*..LQG in 3+1 dimensions has not been shown to break or deform Lorentz invariance..*/

Why not - but is the 3D space filled with tiny density fluctuations still strictly three-dimensional? IMO it's three-dimensional only for the tiniest particles, which are able to follow subtle density fluctuations along local geodesics - or only for the sufficiently large ones (like large photonballs), which tend "to ignore" them as a whole.

For particles of intermediate size such space becomes effectively higher dimensional and more complex than just 3D - so it can violate Lorentz symmetry in an "observable" way - while still keeping the LS expected for a higher dimensional space-time.

It’s clear from Lee’s comment that LQG in itself does not mandate a violation of Lorentz invariance, and obviously it doesn’t mandate that it be maintained, or there would have been no speculation on the matter. I would ask you, or anyone I would recognize as being better qualified, whether string theory has this as a simple yes or no mandate in regards to its formalism? I anticipate, however, from what I’ve gathered so far, that this is not the case either. What I’m saying is, if my question regarding string theory is answered the way I suspect, then I fail to see what all the fuss is about.

Just as a follow-up: what if it turned out, with subsequent observations, that there is no violation to be found within a significant limit; would this not strengthen the case that space-time is not an entity that exists as quantized, but rather is of an indivisible nature?

Of course my motivation in asking such a question is that the currently most widely accepted ontology for nature within physics is a monistic one, where the wave function in essence represents the total description of the world. So I guess my speculative question really is to ask: if Lorentz invariance stands, would this not make a case that quantum theories which imply a dual ontology need to be more seriously considered as a route to what’s so often referred to as unification; or rather, in this case, imply seeking a model that would allow for a coexistence of the theories?

/*...if string theory has this as a simply yes or no mandate in regards to its formalism...*/No rigorous theory can predict a violation of its own formal postulates in an atemporal way. From a theory considering 1=1 you cannot derive 1=2 at the very end. And Lorentz invariance belongs to the fuzzy sets of postulates of the various stringy theories - we can even say it's the least common denominator of these postulate sets.

Indeed, string theory could easily model a violation of Lorentz symmetry in an inhomogeneous 4D space-time simply by declaring it higher-dimensional, in the same way as LQG - the only problem is, scientists on both sides of the ST/LQG duality still haven't realized it, while still seeking signs of both extra dimensions and Lorentz symmetry violation - although they are dealing with them all the time.

Fortunately I always weigh any answer offered by first assessing whether the response given actually addresses the question. My question weighed upon space-time remaining indivisible, while what we recognize more commonly as the quantized elements of reality remain so. There has been much attention drawn to the fact that LQG is not a background-dependent theory, while string theory is. Therefore from an ontological perspective, at least on the surface, this would seem to suggest that LQG is monistic, while string theory is dual. Of course, from what I can gather, this scaffolding that Lee insists exists in string theory has never been addressed in a physical sense, as to what role it plays or the implications it brings.

There's an interesting self-locking mechanism: string theorists (ST) could introduce Lorentz symmetry (LS) violation by considering extra dimensions, but they hesitate to propose it, because LS belongs among the ST postulates in 4D space-time, while LQG proponents could introduce extra dimensions by considering Lorentz symmetry violation, but they grudge against it, because they originally proposed LQG as just a "4D theory".

At the moment, when both sides are taking half of the grant support, no one wants to start the reconciliation of both theories by considering the ideas of the dual theory. In such a way both sides are effectively locked inside the ivory towers of their own prejudices. I presume this example situation explains a lot about how symmetry breaking works at the conceptual level.

I presume theoretical physicists should be paid for the reconciliation of existing theories by decreasing the number of postulates, instead of for the development of new ones by increasing the number of existing postulates.

/*...my question weighed upon space-time remaining indivisible, while what we recognize more commonly as the quantized elements of reality remain so..*/

Yes, space-time is quantized. No, these quanta are formed by other quanta, recursively. No, the observability of these smaller quanta decreases with their size. No, we can increase the visibility of these quanta arbitrarily by widening the space-time or energy density scale used in experiments. No, we cannot do it in an arbitrary way, being limited by the size and energy density of our Universe. No, we can consider the building of equipment in parallel Universes in a distant perspective... No, from an even more general perspective no parallel Universes exist...

Zephir: I just deleted your last comment. I've told you several times already this is not the place to advertise your theory of whatever, I don't tolerate links to sites I don't approve of, and I'm damned tired of your babbling, so please stop it.

As I said to Plato in another of your threads, when examined closely in science it is nature who stands as the accused and not the scientist. I then find it strange when some in your ranks are so mistaken as to confuse one another as serving in that role. I would suggest that when such people act as they do, it is not a blindfold that’s required when they fail to note this important distinction between science and justice, but rather a muzzle. I’m reminded in this case that justice is then fair to be used when science is forsaken :-)

"Yes, there are motivations, and indications that within certain scenarios one can have such a dispersion [...] but as the authors state themselves "The calculation of such effects is beyond the scope of current theoretical methods".

"The idea is that quantum gravity could affect the propagation of photons; we previously discussed this possibility here and here. Such modifications are quite generic and appear in various approaches to describe quantum graviational effects, some of which are inspired by loop quantum gravity, some are inspired by string theory, some are inspired by God-knows-what. There are however no derivations of such effects from a fundamental theory, at least not yet. At the present status, one should thus understand these models as examinations of specific features the to-be-found underlying theory could have."

My wording in the transcript of the EDGE interview is indeed imprecise and I am sorry about it. I should have inserted a sentence indicating that the prediction of Gambini and Pullin depends on a particular choice of state as well as theory. What you have here is not a difference of opinion between Rovelli and myself, it is the difference between the precision one can impose when writing for other scientists as opposed to an edited and condensed transcript of a verbal interview. At this remove it would be hard to reconstruct whether I neglected to make the appropriate caveats, or did so but they were edited out, but in any case the result is my responsibility and I certainly regret I did not insist on a bit more precision in the final text.

At the same time, while the text is imprecise, I think it is clear that I am not using this interview to advance a claim that LQG makes some specific prediction. Indeed, a bit later in the interview I note that the prediction of Gambini and Pullin mentioned in the quote you used is already ruled out.

“Recently, people have understood that this possibility appears to be ruled out by experiments that have already been done: that is, if the principle of relativity fails when quantum gravity effects are taken into account, effects would already have been seen in certain very delicate measurements involving atomic clocks and in certain astrophysical processes involving supernova remnants. These effects are not seen, so this drastic possibility seems less likely. So a hypothesis about the structure of space and time on scales twenty orders of magnitude smaller than an atomic nucleus has been ruled out by experiment!” Here I am referring among others to the paper of Gleiser and Kozameh, gr-qc/0102093, which in fact ruled out the predictions made by Gambini and Pullin. So I think a fair reading of the transcript is that I am communicating my excitement about the general fact that there could be quantum gravity effects which will soon become observable.

By the way, were it the case that I had believed in 2003 that LQG made an unambiguous prediction and now had changed my mind I would not be embarrassed to say so. This is the nature of science.

One reason I am sorry about this is that in my writing, whether scientific papers or for the public, I’ve worked hard to be precise. For example, on p 237 of TTWP I am clear that there is as yet no prediction from LQG for the symmetry of the ground state. “What about other (than string theory) approaches to quantum gravity? Have any predicted a breakdown of special relativity? In a background-independent theory, the situation is very different, because the geometry of space-time is not specified by choosing the background. That geometry must emerge as a consequence of solving the theory. A background-independent approach to quantum gravity must make a genuine prediction about the symmetry of space and time.

As I discussed earlier, if the world had two dimensions of space, we know the answer. There is no freedom; the calculations show that particles behave according to DSR. Might the same be true in the real world, with three dimensions of space? My intuition is that it would, and we have results in loop quantum gravity that provide evidence, but not yet proof, for this idea. My fondest hope is that this question can be settled quickly, before the observations tell us what is true. It would be wonderful to get a real prediction out of a quantum theory of gravity and then have it shown to be false by an unambiguous observation. The only thing better would be if experiment confirmed the prediction. Either way, we would be doing real science.”

Dear Anonymous, 1) I am not sure re LQG. I and others have tried to convey the difference between the following two statements: “There are papers that show that, were certain additional assumptions to hold, LQG could produce predictions for modified dispersion relations.” And “LQG, from first principles, makes those predictions.” The first is true, the second is not. As we see in the EDGE interview, this is not always easy, but I don’t think that experts have ever been confused about this.

For example, in a review paper written around the time of the EDGE interview, hep-th/0303185, I include in a list of open problems for LQG, “Refine the existing calculations that predict modified energy-momentum relations, to determine whether or not the theory makes unique predictions for the parameters α, β in the modified energy-momentum relations (2), for the different particle species. Resolve the question of whether Lorentz invariance is realized exactly, broken or realized non-linearly in the low energy limit of loop quantum gravity.”

Regarding string theory, so far almost all string backgrounds are Lorentz invariant and so are the predictions. As I indicated it is unknown whether there can be any consistent interacting string theories with deformed dispersion relations.

2) I don’t know, this is a key question and I and others have been working on it. It is not an easy problem, certainly it is harder than I expected initially. I’m considering writing a paper on why it is such a hard problem, but I’d rather solve it. My view is that it would be better for science if a theory makes a definite prediction which is ruled out, rather than no prediction at all.

3) The basic idea is that there would be stochastic predictions coming from quantum fluctuations at the Planck scale. This is not something I myself have worked on, so I don’t think I can say more than is there without going into a lot of technicalities.

Thank you for your reply. You write "I don’t think that experts have ever been confused about this." I do understand now the distinction you make, but as a member of the public I wish to say that I wish it had been made previously. Bee writes above that the information has certainly been available, and you write it has been evident to experts, but more popular sources had left me with a different impression, though I do not mean to blame this on you personally. It is disappointing to learn that even these new observations are inconclusive for the reasons you explain, though I remain optimistic that the problem you address in 2) will be resolved.

It seems to me that the statement(s) about LQG and DSR that you make now, "My wording in the transcript of the EDGE interview is indeed imprecise and I am sorry about it",

are quite different from statements you made earlier, e.g. in a 2006 talk about LQG (see www.physics.utoronto.ca/~colloq/Talk2006_smolin/smolin-oct06.pdf):

"indications of novel and testable O(lp) effects including deformation of Poincare symmetry leading to an energy dependent speed of light. This is shown precisely in 2+1 but only semiclassically in 3+1"

Would you agree that it seems you have somehow changed your mind recently?

To my eyes, Lee pretty explicitly makes a "falsifiable prediction" in the abstract of hep-th/0501091:

"Quantum gravity is studied in a semiclassical approximation and it is found that to first order in the Planck length the effect of quantum gravity is to make the low energy effective spacetime metric energy dependent. The diffeomorphism invariance of the semiclassical theory forbids the appearance of a preferred frame of reference, consequently the local symmetry of this energy-dependent effective metric is a non-linear realization of the Lorentz transformations, which renders the Planck energy observer independent. This gives a form of deformed or doubly special relativity (DSR), previously explored with Magueijo, called the rainbow metric. The general argument determines the sign, but not the exact coefficient of the effect. But it applies in all dimensions with and without supersymmetry, and is, at least to leading order, universal for all matter couplings. A consequence of DSR realized with an energy dependent effective metric is a helicity independent energy dependence in the speed of light to first order in the Planck length. However, thresholds for Tev photons and GZK protons are unchanged from special relativistic predictions. These predictions of quantum gravity are falsifiable by the upcoming AUGER and GLAST experiments."

/*..what its advantage over string theory...*/ People aren't doing things because they're rational or even useful, but because they can be done and someone is willing to pay for it (which is a usual, but not a necessary, condition).

Al: I have no idea what your problem is either. The predictions in the paper you mention are based on some assumptions that are explicitly stated. For all I can tell, it is nowhere even remotely claimed that this is a prediction of LQG. If it turns out a first order Planck-scale modification of the dispersion relation can be ruled out (I don't think the observation discussed here even remotely does that), then there are several models that can be thrown in the trashbin, and at least one of the assumptions in the paper you mentioned must be wrong, but that's a completely different question than whether one of the currently pursued approaches towards QG can be ruled out.

First, I would really appreciate it if you anonymous guys could do as much as enumerate yourselves. Please choose the option Name/URL below the comment window. It will open a drop-box where you can enter a name (or a number, if you're uninspired); you don't have to enter a URL. I also want to point out that criticizing others without signing your name is nothing but a sign of cowardice and something I generally disapprove of.

Second: It is quite comical how you and several other people in this comment section try to find somebody to blame for your own misunderstanding. If you are stupid enough to believe what you read in the popular press (or on blogs possibly) point the finger to your own nose. Statements by anonymous posters without reference according to which "LQG champions" allegedly have claimed something for several years are nothing but irrelevant noise.

Bee: When I see a paper titled "Falsifiable predictions from semiclassical quantum gravity", and then read in the abstract the sentence "These predictions of quantum gravity are falsifiable by the upcoming AUGER and GLAST experiments", with no mention of any caveats or assumptions, I naturally tend to think that a robust prediction is being made. One can of course look into the work further and discover that it all rests on a foundation of sand, but wouldn't it be nice if the author could just give me the straight dope up front? In my opinion, this is no way to present research.

Al: a) It is generally advisable to read a paper before jumping to conclusions from the abstract. I don't want to discuss other people's style to present their research. b) No conclusion is ever better than the assumptions that lead to it.

As far as I can tell, if a quantum gravity theory is to be considered attainable, it must have repercussions for general relativity at some high enough energy, no matter whether the theory be labelled string, loop or whatever. The thing is, no one can claim to have a full blown quantum gravity theory that renders definitive predictions, only models of such theories representing works in progress. One can’t compare these prototheories with any of the long established ones, which do have elements that are definitively falsifiable as they relate to our world.

Perhaps it’s not a bad thing for experiment to be ahead of theory in this case, for it will further narrow what the parameters of such a theory can be if and when it can be made predictive. For that matter, the standard model could be argued to be a theory that to a significant degree is itself directed and dictated by experiment, instead of the other way around, with many of its parameters being mandated by observation, rather than a direct consequence of theory. If a QGT is ever achieved it may be no better in this regard, or perhaps even worse.

Of course this was not the initial goal, for it was hoped to give reason for many of what are called the free parameters, and much further. So rather than a TOE, we may end up with a TOA; that is, instead of a theory of everything, only having a theory of anything, whether it describes our specific world or any other you might choose to imagine. As far as I can gather, Smolin is betting that such a discovery will prove able to extend our practical knowledge of the world, and I for one am wishing him to be correct in this, regardless of what the theory might end up being. That is, I would rather see theoretical physics end after having us know all which is needed to explain the basis of our world, than see it finished as a result of being rendered unrevealing and thus a dead end.

Bee: Since I do not have the time to read every paper on the arxiv, I appreciate it when authors give an accurate summary of their work in the abstract. I think this example fails on that account. Either you are too polite to agree, which is understandable, or you really see no problem with this kind of presentation, which I would find alarming.

To be honest, even reading past the abstract of this paper, I still get the definite impression that a fairly strong prediction is being made here. Lee gets credit for going out on a limb, but I don't think it's unreasonable for him to get called out when the prediction is not verified.

Okay, now everybody agrees that LQG makes no predictions. If we add to it the fact that it is unlikely to be consistent, we get the perfect state of affairs in which one abandons this direction of research. RIP.

It appears that for many represented here, the sum total of all knowledge should be able to be represented as a collected series of headlines and punch lines. As I recall, this is something we discussed in one of your previous posts, where I mentioned that those like Ray Bradbury have long warned of such changes in comprehension relative to attention span. I would be saddened to discover that, because of such a self inflicted limitation, arxiv might be forced to switch to a format with limits reflecting this, such as twitter :-)

I sympathize, and most of us try, when communicating with the public, to be precise. It is not always easy in the midst of research to say things that are both stripped of the myriad qualifications and caveats that are part of our normal communication with experts and completely precise. My book, quoted above, was written for the general public and made precisely the distinction in question.

Dear Wolfgang,

You seem to be quoting out of context. Just after the line you quote, the transparency says: “Thus DSR is a consequence in 2+1 dimensions. Is this also true in 3+1 dimensions? So far there is a semi-classical argument, but no proof.”

Dear Al,

The abstract says precisely what the paper does. I don’t see that there is anything misleading at all. The assumptions the calculation is based on are clearly stated in the paper. If the resulting prediction is falsified, then so is at least one of those assumptions. I am happy to stand by that.

There is no claim there of a derivation from LQG, indeed LQG is not mentioned in the abstract or the introduction of the paper. Instead a new method is introduced for studying any quantum theory of gravity that can be described by certain assumptions, in a semiclassical approximation. The word semi-classical refers to an expansion around a classical background, so these are quite disjoint from the background independent techniques used in the core of LQG.

Dear Anonomous2,

I have to disagree that it is a "simple fact" that, "For many years LQG champions have claimed that their theory makes predictions."

There are a number of standard textbooks for LQG, written by Ashtekar, Gambini and Pullin, Rovelli, and Thiemann. In none of these is the claim made that there are definite predictions for modified dispersion relations. Nor is this claim made in the standard review articles. In my own review quoted above, I made it clear this is an open problem. As we just saw in the transparencies of talks I've given, it is also clearly stated as an open problem.

Lee: If you were being honest you wouldn't have written "these predictions of quantum gravity..." but rather made clear that your predictions were highly model/assumption dependent. There is no mention of models or assumptions in the abstract, other than the semi-classical approximation, which has a standard meaning to most readers. Turning it around, any reader would think that the failure to find these effects would mean that semi-classical quantum gravity has been falsified. Are you ready to stand behind that? A truly generic prediction of quantum gravity would be very exciting. For this reason I spent some time studying this paper, and frankly felt cheated once I discovered that there were no real "predictions of quantum gravity" being made.

I am being honest. The assumptions of this paper are not highly model dependent, but they are assumptions, and while general, one or more of them may be wrong.

If you want to continue the discussion we have to get down to technicalities, as we have to state the precise assumptions and discuss what we think of them. I am happy to do that either here or otherwise, if you are, but I have two requests: 1) withdraw and refrain from antagonistic language such as "If you were being honest..." and 2) be honest enough with me to tell me who you really are and what you know.

Kea: Thank you so very much for your valuable and insightful comments. It is not so much that you are rude, but your contribution is completely superfluous. If we are so boring, why don't you just go elsewhere.

Yes, I also found the discussion here to remind me of many things we discussed previously, for example how the inability of the media to deal with scientific uncertainty easily creates inaccuracy (see Dealing with Uncertainty). I also previously commented in Fact or Fiction that whoever writes for an audience has a responsibility to provide accurate information (the more readers, the more responsibility). Unfortunately, entertainment and accuracy clash frequently, and often entertainment wins at the expense of accuracy. Unfortunately, it's a lose-lose situation for scientists. Whether or not your statement is eventually confirmed, you'll in the end have to explain that one can't draw the conclusions you claimed because the qualifiers were missing. I understand that it can be difficult to find the right balance, but these days the balance is definitely off towards entertainment. It's a game scientists shouldn't engage in, even if it makes us seem dry and dull. I can stand being dry and dull ;-) Best,

Al: "Either you are too polite to agree... or you really see no problem..."

This has nothing to do with politeness, I'm simply not interested in hearing whether you like somebody's writing style, and this isn't a writers' seminar. You have misunderstood the abstract of a paper, commented on it without reading the paper, and now you're trying to blame the author's writing style for your resulting misunderstanding? Grow up.

Perhaps it might be helpful if someone were to explain which of the assumptions, that went into hep-th/0501091, is/are falsified by the Fermi observation under discussion. It would then be clear what (if any) implications there are for the wider program of LQG.

In particular, are there assumptions in hep-th/0501091 that go beyond those of LQG (and which, hence, could be jettisoned, without affecting the viability of LQG)?

on bounds to the gravitational-wave spectrum are of direct relevance for quantum gravity (especially LQG) as a probe of the pre-Big-Bang nucleosynthesis epoch. If that is the case, what are the expectations/predictions (if any) for the spectrum?

Dear Jacques, Yes, there are several such assumptions. One is the scaling assumption I have already mentioned in the above posts. In case you want to understand this in detail, the following is a detailed, and technical summary of the paper.

The argument of hep-th/0501091 is in three stages. The first, in section II depends on four numbered assumptions, which are given in section I.

1) The configuration space is the space of a connection, A,

2) the action is of the form given in eq. (1) and the Poisson brackets are of the form of eq. (2).

3) Diffeomorphism invariance.

These are all true as mentioned for classical GR and supergravity in all dimensions d=2+1 and above.

4) We choose to work in the connection representation, where states are functionals of A and where the densitized frame field is represented by (3). This is where we depart from the classical theory and of course the choice of representation matters.

The connection rep used here is a heuristic step to the spin network representation, but it is not equivalent to it. The spin net rep, and hence LQG, is based on a particular choice of measure and inner product on the function space, the Ashtekar-Lewandowski measure, which is not assumed here.

We also assume that in the semiclassical approximation the wave functional has the form of the exponential of a solution to the Hamilton-Jacobi equation (eq. 4). At present we do not know any state of this form which is normalizable in the Ashtekar-Lewandowski measure, so we do not know whether or not this assumption is satisfied in LQG.

Based on these assumptions we derive a relation (17) for the action of the densitized triad on a semiclassical state including matter, which leads to the modified dispersion relation (22). The key quantity is the constant alpha, which is determined by constants in (16). Of these, rho and M will in many cases be Planck units. Alpha also depends on the undetermined constant mu, from (8), that arises from the translation of the time functional on the configuration space to a time coordinate on a given space-time geometry.

Since alpha is undetermined we only get a functional form of a modification, but not an exact coefficient, as mentioned in the abstract.

Daniel: You are mentioning an important issue that I have pointed out in earlier posts, which is that there can very well be astrophysical effects at the source and during propagation that play a role and that have to be cleanly identified in order to draw any conclusions about other effects. I don't know enough about the topic to tell offhand whether what you say could explain the observations, but it's not clear to me why such an effect would increase gradually with energy, nor why it would lead to a delay rather than a decrease in luminosity. I would expect an absorption in some frequency range.

In section IV I use a more detailed argument to try to tie down the value of alpha and its dependence on fundamental constants. This is the second stage.

The strategy is inspired by the argument from quantum group theory reviewed in section V, and so defines the theory with a fixed Lambda (the cosmological constant), and then takes the limit of Lambda to zero. This means that we must, as in the group theory argument, introduce scalings of certain operators by powers of the ratio of the Planck length to the radius of the universe (that is, UV over IR cutoffs).

In the argument of this section this becomes the scaling needed to turn the matter term in the Hamiltonian constraint into the matter Hamiltonian, when we take the limit of Lambda to zero simultaneously with the semiclassical limit. This defines the matter theory on a fixed flat spacetime. This scaling is measured by a function Z defined by (32), an ansatz for which is given by (33). Alpha is then determined by (38), which, when we put in the simplest case, d=3, and canonical scaling for the cosmological constant, r=0 implies that there is a finite deformation when n=1.

This is the additional scaling assumption mentioned in the above postings. To confirm it, we need to compute Z; to do this we need information from the full quantum field theory. If a given theory yields n=1, the resulting alpha is of order unity and we get an effect of linear order in the Planck scale.

In section VI, the third stage, we study a particular semi-classical state, which is the Kodama state, in order to illustrate how the general argument works in detail in a particular case. But it turns out we still need a microscopic calculation to compute alpha.

Note that the Kodama state is controversial when used as an exact microscopic state, but here it is playing a role only as a semiclassical state, which follows from the fact that the Chern-Simons invariant satisfies the Hamilton-Jacobi equation. In any case, nothing in the first two stages of the argument depends on the use of the Kodama state.

As I said above, the most vulnerable assumption of the argument is the scaling assumption. I have tried various strategies to derive Z, but so far none have worked. As we have at present no information as to how Z should scale in LQG, we have no indication of what the prediction from LQG is.

Now to your first question: given that even with the scaling assumption we only get to the conclusion that alpha is order unity, I would say that the whole argument is falsified if alpha is bounded far below unity. The recent observations, interpreted conservatively with regard to the astrophysics, bound it to be less than around unity, which is intriguing but is not conclusive. If future observations or improvements in the understanding of the astrophysics of the sources allows alpha to be bounded to be less than 1/100, which we see from the Fermi collaboration paper is very much within the realm of possibility, I would say the whole argument has been falsified. But to know if LQG or any particular theory has also been affected we need to have a calculation of alpha within that theory.
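For a sense of scale, here is a back-of-the-envelope Python sketch of the linear-order delay formula usually written as dt ~ alpha * (E / E_Planck) * (D / c). The light travel time (roughly 7 Gyr for z ~ 0.9) and the neglect of cosmological redshift factors are my own simplifying assumptions, not numbers from the Fermi paper; the point is only that alpha of order unity corresponds to a delay of order a second for the 31 GeV photon.

```python
# Illustrative order-of-magnitude estimate (not the collaboration's analysis).
# Linear-order LIV delay: dt ~ alpha * (E / E_planck) * (light travel time).

E_PLANCK_GEV = 1.22e19          # Planck energy in GeV
E_PHOTON_GEV = 31.0             # the GRB 090510 photon
TRAVEL_TIME_S = 7e9 * 3.156e7   # ~7 Gyr in seconds (assumed; cosmology-dependent)

def linear_delay(alpha):
    """Delay in seconds for a first-order (linear) Planck-scale modification."""
    return alpha * (E_PHOTON_GEV / E_PLANCK_GEV) * TRAVEL_TIME_S

# With alpha = 1 the delay comes out to roughly half a second:
dt = linear_delay(1.0)

# Conversely, arrival within ~1 s of the burst onset bounds alpha to order unity:
alpha_bound = 1.0 / ((E_PHOTON_GEV / E_PLANCK_GEV) * TRAVEL_TIME_S)
print(dt, alpha_bound)
```

This crude estimate reproduces the qualitative statement above: current data bound alpha to be less than around unity, while an improvement of the bound by two orders of magnitude would need either a cleaner source or better astrophysical modelling.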

if you see the supplement of the paper, you will see a graph that shows the average delay versus photon energy (Figure 2). You see that above a few hundred keV, around 2 MeV - 20 MeV, there is a fast growing curve, which right after becomes more or less flat. This is the Compton scattering energy for most materials.

At higher energies, up to 500GeV, there is interaction with the background light. The speed of light is effectively reduced because of the photon-photon scattering.

http://en.wikipedia.org/wiki/Gamma_ray

Anyway, even at all this distance, any delay for high energy neutrinos would be smaller, given their tiny mass, even if only the photon had a modified dispersion relation with a significant first order Planck scale modification.

It would be better, then, to analyze the arrival time of high energy neutrinos, not photons.

Wow, I can see this topic is controversial and involves some confusion about meanings and implications. Well - we would indeed expect (me, in my middle-brow way on this) that if there is a relation between QM and gravity/space-time structure, for ST to be foamy and jittery at around the Planck scale, right? And if it was, it is hard to see how high-energy photons (or other entity with such short wave length) could fail to be scattered and/or slowed by the foam - per Lee's description and my own intuition (as I thought long ago when looking at the diagrams.)

Yet if the experiments say "no" (or is it just that not enough evidence has been compiled yet?), then what? What is space doing to evade this effect? I don't see how it could fail to be grainy. String fans say they don't have this problem, but I haven't seen good explanations of their picture of "empty space-time" instead of just particles.

Also, commenters seem to have forgotten the issue about energy density versus energy of any one photon. So then, the issue of these effects is still open, despite the insinuation from the recent experimenters and their fans (esp. militants like Lumo) that this singular result is near-fatal for DSR type theories. Anyone?

OK, interesting re photon-photon interaction, and thanks re ST later. I hadn't seen PPI worked into the assessment of the topical result. Doesn't that complicate things even more? Would it effectively slow photons (even the ones that were not redirected per se), or just give them a chance of scattering?

The model I was alluding to has nothing to do with either string theory or photon-photon scattering. It relates to the Cantorian topology of space-time at large energy scales and it complies with Lorentz symmetry. Here is the reference, if you are interested:

Thanks Ervin. I thought you meant "string theory" when you said ST; I guess you meant space-time. Maybe "S-T" is better for the latter? Also, PPI seems relevant in any case, regardless of being featured in any specific theory. PS: I want to friend your FB page if OK, I am already on Bee's!

Daniel: For all I can see, the reason why the curve becomes "more or less flat" is that it's a log plot. The assumption here is that the delay grows, to first order, linearly with the energy over the Planck mass; I can't see how you would get that with either interstellar gas or photon scattering (the energies we're talking about are some orders of magnitude below a TeV), nor do I understand why and how the Planck scale should come into play there. However, let me repeat again that you are right in that one carefully has to identify astrophysical effects before one can draw conclusions. Modified dispersion relations are common in-medium effects, which is essentially the reason why they aren't difficult to obtain within all kinds of models.

If you don't mind my suggestion, I urge closing this post soon and reopening the topic later. It would be helpful to let things cool off a bit. There are a lot of partisan feelings involved and, perhaps, too many confusing concepts thrown into the mix. We all need the distance to reflect and gauge what this latest finding is really telling us.

As I recall, it was once proposed by Hawking that primordial black holes could be a source of some of the gamma ray bursts. This has me wondering if this is still considered a possibility or has been ruled out. It also has me questioning how the distance and origin of these bursts is determined. I guess what I’m asking is what differences there would be in the photon spectra and duration between those received from a star gone supernova and those from the final disintegration of a primordial black hole. It seems to me that both the spectra and duration would be markedly different, and for primordial black holes, although relatively close, the total number of quanta received would be fairly limited.

Daniel: Sorry, I was looking at the figure in the paper, not in the supplement (and in addition Fig 1, not 2). You're right. To be honest though, I don't find the plot too striking. That curve might, or might not, remain constant. In any case, the problem with the time lag is the same as in the main paper. They've made an assumption about where to expect the emission; in this case it seems they've tried to trace three peaks in the spectrum. If the higher energetic photons are, e.g., simply not emitted in the first peak or just don't follow the peak structure, that figure is simply guesswork.

You know, all this discussion about delayed photons made me realize that one cannot use photons from distant regions of space, but should use neutrinos instead, nor consider that the first order correction in Mpl/Mqg is linear, because it would mean that Planck effects on photons would be stronger than the intergalactic background and dust noise.

If the first order correction is quadratic, it could be a clue that DSR follows a hyperbolic law, or even a sigmoid law, in which there would be an upper threshold for photon energy, in the sense that the Planck scale would be saturated with energy.

Daniel: Before you claim photons cannot be used, would you please provide a reference that addresses the points I mentioned above: a) what is the energy dependence of photon scattering on interstellar dust, b) what is the effect on the luminosity, and c) why would the Planck scale play any role in this effect?

I think the idea that gamma ray bursts originate from primordial black holes is pretty much dead, since it's been discovered that GRBs originate within faraway galaxies and no other evidence for primordial black holes has been found. See, if the GRBs were caused by PBHs, that would be because these PBHs just happen to completely evaporate today, which corresponds to a very narrow mass range (mass at formation). One would then expect there to also be plenty of them somewhat heavier that should still be around (plus the ones that previously evaporated might have left an imprint in the CMB). In addition, I would think that if it were PBHs evaporating, the events would be very regular, which GRBs aren't.

Let me begin with c). Any Lorentz symmetry breaking with a linear dependence on Mpl/Mqg was ruled out. A quadratic dependence is insanely small, by more than 30 orders of magnitude, if you try to measure it by means of long distance GRBs.

With 30 orders of magnitude, there is a lot of room for systematic errors, so it would be more practical if one just looked for an ultra high energy source that is free of noise. And given the precedent that photons interact with so many things, it is better to deal with neutrinos.
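To put the linear-versus-quadratic suppression in rough numbers, here is a sketch under the same illustrative assumptions as the earlier estimate (a 31 GeV photon, ~7 Gyr light travel time, cosmological factors neglected; these are my assumptions, not anyone's published analysis). Note that for these particular numbers the simple estimate gives a suppression of roughly 17-18 orders of magnitude relative to the linear case; the exact figure one quotes depends on which energies and baselines are compared.

```python
# Same back-of-the-envelope setup as the linear estimate, now for a
# quadratic (second-order) modification: dt ~ (E / E_planck)**2 * (D / c).
# All numbers are illustrative assumptions, not a published analysis.

E_PLANCK_GEV = 1.22e19
E_PHOTON_GEV = 31.0
TRAVEL_TIME_S = 7e9 * 3.156e7   # ~7 Gyr, assumed light travel time

ratio = E_PHOTON_GEV / E_PLANCK_GEV   # ~2.5e-18, dimensionless

dt_linear = ratio * TRAVEL_TIME_S          # order of a second
dt_quadratic = ratio ** 2 * TRAVEL_TIME_S  # order of 1e-18 seconds

# The quadratic delay is far below any conceivable timing resolution,
# which is why a quadratic modification cannot be probed with GRB photons.
print(dt_linear, dt_quadratic, dt_linear / dt_quadratic)
```

Whatever the precise suppression factor, the design conclusion is the same: a second-order effect is hopelessly beyond the reach of photon timing from GRBs.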

Daniel: The purpose of this post was to say that the analysis presented in the paper does not rule out even the linear case (whether or not I think such a modification is plausible). It only rules it out if you buy into one specific guess by the authors for the emission time of one particular photon, and what sort of evidence is that? Just look at table 2 and see that the bounds with "conservative" guesses are pretty close to the Planck scale.

The possibility of competing effects from scattering of photons off dust or gas was studied in detail by Bombelli and Winkler, among others, in gr-qc/0403049, and was ruled out as a competitor for linear effects in Planck scale dispersion.

The possibility of using very high energy neutrinos to study Planck scale dispersion has been studied before, most recently in the paper by Amelino-Camelia and myself, referenced in my first post above.

In Susskind's thought experiments, he was to suggest that as we equate, we need to understand that a "calibration point" was needed in terms of discernment of photon recognition of black hole origins and beyond?

Yet, how is it that we know where gravitational inclinations are first, if we did not pursue time closer to the information that the "elephant represents" inside and outside the black hole?

Of course, the elephant is allegorical to "quantum gravity," as well as to "complementary measure?"

There are certainly DSR dispersion relations that start linear and then at higher order saturate an upper limit to energy or to momentum, or both. The literature has examples of all three. So the issue of saturating an upper limit is not related to whether the leading order is linear or quadratic.

Some argue that quadratic is more natural for Lorentz symmetry breaking, as linear order for photons requires breaking parity.

In any case, the important thing is that at this point theory gives a menu of options, with no absolutely compelling case for any of them, so it is a good thing we now have an experimental window on this.

Quantum interrogation is interesting. :) Fuzzy logic in a complementary position leaves one with a feeling of inexactitude, yet it seems particular to the coloring of the attributes of the definition of topological movement? :)

This is a key question which was covered in many places, for example my paper with Amelino-Camelia or my post above. See those for details.

The linear correction is parity odd in the case of Lorentz symmetry breaking and parity even in the case of DSR. The former is ruled out by several orders of magnitude because it leads to rotations of planes of polarization; see the reference by Gleiser and Kozameh I mentioned above. The latter, the DSR, parity even case, is not ruled out, although further observations by Fermi may be able to do that.

The distinction between broken and deformed Lorentz invariance is a key point in this whole discussion.

One photon, two photons, and then no such thing as entanglement? Makes it hard to destroy foundational thinking when there is a whole history behind it.

Its evolution allows one to see in other realms that we were not able to see in this one, and yet prepares the mind for movement in those other realms? You had to have toposense in order to use the colorimetric, and in order to do that, you need to consume it fully? :)

On the other hand you need to be aware of the maybe most bizarre aspect of quantum mechanics: the fact that our world is only apparently three-dimensional; in reality it is formed from two three-dimensional worlds that we see superimposed onto each other: everywhere we think we see one point, in fact there are two points. An elementary particle (such as an electron), when it rotates around its axis, passes from one part of the world into the other part, as if it would climb a spiral; when it rotates further it goes back, as if the spiral would descend back into the first part of the world. (Bold added for emphasis by me.) See: Superfluidity makes headlines once again. Sorry, I know, old news.

Re a single photon, I'm reminded of Blas Cabrera's famous monopole search experiment at Stanford. For those who were born too late to hear of it, he saw a perfect monopole candidate, but it has never been confirmed, even with much larger data sets.

This is not to say that I think that the Fermi data is bad, just that every now and then nature does pull off a stunt. My guess is that the speed of light does not depend much on frequency.

Thanks for clearing up my wonderings about primordial black holes perhaps being overlooked. It’s just that this burst is described as a short duration one, and it seems they have no solid explanation of what distinguishes the sources of the long ones from the short. Of course, I suppose if this had any chance of being a PBH burst, both Hawking and the string theorists would be all over it by now. That is, for Hawking it could mean the Nobel Prize, and it would reinforce the existence of a mechanism that allows even very low mass PBHs to have survived this long, since they are part of many versions of string theory.

I also thank you for that piece on examining the afterglow of gamma ray bursts in determining their redshift, as it cleared up much for me in that regard. The thing I still wonder about is whether the spectrum of a primordial black hole disintegrating would be the same as that of a supernova explosion, and whether such a calculation has ever been made. It appears to me that this is all tied into the information loss issue, as to what still remains in a black hole until its demise.

It could be argued that a PBH’s spectrum could be tied into such an observation of its afterglow. Two things that would seem to be necessary elements of any PBH disintegration would be a very short burst and the emission of some quite high energy photons. The thing with this burst is that, although the redshift is evident (and suggests that it's extragalactic), it’s not all that great, having been measured at z = 0.903, compared to the largest thus far recorded, z = 8.26, from a burst thought to originate near the beginnings of the universe. Anyway, I find this all very interesting, and perhaps an indication that we might need to construct even more sensitive instruments with a greater collection base.

I recently was reading the August 15 edition of New Scientist and the main article was about QG. Now, that article indicated recent experiments did find a time lag in high energy photons from gamma-ray bursts.

How does that compare/contrast with the FERMI experiment and what are the implications for quantum gravity? ( OK, it's two questions :)

/*.. I don't know what Lubos' problem is...*/ It's not so difficult to understand. The problem is, Lee Smolin presented gamma ray dispersion as a way to falsify LQG, and now he denies it.

http://www.aetherwavetheory.info/images/physics/lqg/smolin_lqg.gif

The picture with the description below is a snapshot of an article in Scientific American, "Atoms of Space and Time" (p. 59), which is definitely worth reading by itself, but it contains disinformation concerning LQG predictions.