OPERA has just found that either neutrinos travel 0.03% faster than photons we've measured, or their equipment has an unknown systematic error. Assuming there's no equipment error, I would find it more palatable to assume that light around Earth travels a bit below c and that neutrinos travel closer to c. What we think of as vacuum could really be a medium with refractive index 1.0003, perhaps due to a uniform background of weakly-interacting particles (maybe even dark matter) that affect photons but not neutrinos.

I have a physics undergrad degree; if there's someone here with better qualifications, would you care to weigh in on the idea that c could be 0.03% faster than the speed of light we measure on Earth?

It's an interesting idea, but quite unlikely... Remember that the speed of light is (supposedly!) an absolute limit, somewhat like absolute zero, and things tend to approach it asymptotically. One can therefore see where exactly the asymptote lies, and we'd quite likely notice the difference. For example, particles in the LHC travel at c - 0.0000009% and have the corresponding properties as predicted by relativity. If they were, in fact, traveling at c - 0.03%, our calculations would be off by more than two orders of magnitude (gamma ≈ 7500 vs ≈ 41).
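A quick sanity check of those gamma factors (my own back-of-the-envelope numbers, nothing official):

```python
import math

def gamma(beta):
    """Lorentz factor for speed v = beta * c."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

# LHC-style particles: roughly c - 0.0000009%, i.e. beta = 1 - 9e-9
g_lhc = gamma(1.0 - 9e-9)

# Same particles if the true limit were 0.03% above the measured speed
# of light, so they'd really be moving at about c_true * (1 - 3e-4):
g_hypothetical = gamma(1.0 - 3e-4)

print(f"gamma at LHC speed:          {g_lhc:.0f}")          # ~7.5e3
print(f"gamma if c were 0.03% off:   {g_hypothetical:.0f}") # ~41
```

So a 300 ppm error in where the asymptote sits would change the predicted gamma by a factor of nearly 200, which accelerator physics would have noticed long ago.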

In short, that much error in c would pretty much wreck relativity anyway.

With the caveat that I don't really have better qualifications than you :).

It's not so simple. We've measured the speed of light to great precision. We know what that speed is, and we know photons are massless, so we know with very high confidence what the speed of massless particles is. If neutrinos travel faster than light, then this is very surprising and points to something new and interesting. I'm avoiding referring to 'c' because it would be ambiguous: in traditional relativity, the constant speed of light is equal to the maximum possible speed, which is also in essence the ratio between space-like and time-like variables in the theory (the slope of light-cones and all that). It's a constant that reappears over and over again, and marvelously it's precisely equal to the speed of light. It can't be as simple as just "we were wrong, c is a bit higher than we thought" because it would immediately mean that "c" isn't as universal as we thought: the symmetry of the universe must be somehow different so that photons and neutrinos (and probably other particles) follow slightly different rules.

But if this result is indeed true, and neutrinos travel faster than light, then this is truly amazing and could mean different things. One possibility is that different particles actually have different 'speed limits' (and different causal cones), so there is c_light, c_neutrinos, etc. There are many other possibilities (extra dimensions, breaking of Lorentz invariance, imaginary mass, closed timelike curves, etc.). All of them amount to a substantial rethinking of some aspect of physics. This is definitely exciting, since it could be telling us something very new! And it won't be as simple as just adjusting a constant a bit. (If we tweak the value of "c" in our equations even just a bit, all kinds of well-tested observations, in everything from cosmology to the functioning of transistors, would come out wrong...)

Lastly, it's worth keeping in mind that it's probably a subtle experimental error (very subtle!). This is still useful, because it will teach us something new about experiment design and possibly even something about particle physics. For instance, the timing calculation is based on certain models of the neutrino packets that are generated. But it could be that the packet that arrives at the end is slightly different from the one sent out at the beginning, thus altering the way one should compute the flight time. This could point to some interesting, previously unknown ways in which neutrinos are generated, or interact with matter, or interact with each other. In any case it will be interesting.
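To make the pulse-shape point concrete, here's a toy illustration of my own (this is NOT OPERA's actual analysis, which fits full waveform likelihoods; the pulse shapes, rise times, and threshold are invented numbers): if the arriving pulse's leading edge is slightly shallower than the template assumed by the timing fit, a threshold-crossing estimate reports a spurious early arrival even though nothing moved faster than light.

```python
import math

def crossing_time(t0, rise, frac):
    """Time at which a sigmoid leading edge 1/(1+exp(-(t-t0)/rise))
    crosses the given fraction of full pulse height (times in ns)."""
    return t0 + rise * math.log(frac / (1.0 - frac))

TRUE_T0      = 500.0  # ns, true edge position at the detector (invented)
ASSUMED_RISE = 50.0   # ns, edge steepness assumed from the source pulse
ACTUAL_RISE  = 60.0   # ns, hypothetical slightly shallower arriving edge

# The analysis sees where the edge crosses 10% of full height, then
# converts that back to t0 using the *assumed* edge shape. If the real
# edge shape differs, the inferred t0 comes out biased.
measured_crossing = crossing_time(TRUE_T0, ACTUAL_RISE, 0.10)
assumed_offset    = crossing_time(0.0, ASSUMED_RISE, 0.10)  # crossing - t0
inferred_t0 = measured_crossing - assumed_offset
bias = inferred_t0 - TRUE_T0

print(f"inferred t0: {inferred_t0:.1f} ns (true: {TRUE_T0:.1f} ns)")
print(f"timing bias: {bias:+.1f} ns")  # ~ -22 ns: appears to arrive early
```

A 20% mismatch in edge shape already produces a ~22 ns spurious early arrival in this toy, i.e. the same order of magnitude as the ~60 ns anomaly, which is why this kind of systematic gets so much scrutiny.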

What we think of as vacuum could really be a medium with refractive index 1.0003

Ahh, the old subatomic ether thing. Look up the Michelson–Morley interferometer experiment that led to all that relativity stuff... At 300 ppm, that effect, if it existed, would prevent most interesting interferometer technology from working. No FT-IR spectroscopy, most inertial navigation systems would be too drifty to use, and astrophysicists would not be able to do the interferometry thing using multiple scopes...
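For a sense of scale (my own rough numbers, assuming a HeNe laser and a 10 cm path): 300 ppm of refractive index is a gigantic effect in optical terms, dozens of full fringes over a bench-top path length, when interferometers routinely resolve small fractions of one fringe.

```python
WAVELENGTH = 633e-9   # m, HeNe laser line (typical lab reference)
PATH       = 0.10     # m, a modest bench-top optical path
N          = 1.0003   # hypothetical "vacuum" refractive index

# Extra optical path relative to a true n = 1 vacuum:
extra_path    = PATH * (N - 1.0)
extra_fringes = extra_path / WAVELENGTH

print(f"extra optical path over 10 cm: {extra_path * 1e6:.0f} um")
print(f"extra fringes at 633 nm:       {extra_fringes:.1f}")  # ~47
```

That's roughly 47 wavelengths of extra path over just 10 cm, so any technique that counts fringes to calibrate distances would be wildly off at 300 ppm.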

The other problem is we've verified E=mc² and time dilation to much better than 300 ppm, both of which depend on c.

Also, it's expensive, and a bit beyond my basement, but your average RF engineer can build stuff to better than 300 ppm from first principles.

Then you start offending the chemists. I'd have to think about it a bit, but wouldn't this screw up quite a bit of chemistry (and physics) related to ferromagnetic materials? And the NMR scanners wouldn't work right, or at least how they work would depend on the phase of the moon; from memory, 300 ppm is a pretty huge shift.

You say that because you're probably not intimately familiar with just *how* well established General Relativity is.

It's a theory which has survived decades of absurdly rigorous testing. Being cautious in how you present it is absolutely the correct approach - and far more responsible than how, say, the debacle over cold fusion [wikipedia.org] was handled.

These are not trivial measurements to make, nor is there any obvious explanatory theory that they confirm. Nor is the result a gross excess - it's well bounded, but a very small difference, on the same timescale as the signal delays within the individual components of the apparatus. It's only us sci-fi nerds who fully expect (want) FTL to be possible and Relativity broken somehow.