
You don't need to invoke all the wonderful, interesting new ideas that are unproven and untested. The neutrinos arrived 60 nanoseconds earlier than expected, with an error margin of 10 nanoseconds. So, really, they just have to have miscalculated the distance between the generation point and the detector by about 60 feet. (Distance done in feet, thanks to the extremely convenient rule of thumb that light speed is approximately one foot per nanosecond.) They've already admitted that they could have had measurement errors of the same order.
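That rule of thumb is easy to sanity-check; a quick sketch (the 60 ns figure is from the result under discussion, the rest is just unit conversion):

```python
# Back-of-the-envelope: how far does light travel in 60 ns?
c = 299_792_458.0          # speed of light in vacuum, m/s

dt = 60e-9                 # the reported early arrival, seconds
meters = c * dt            # distance light covers in that time
feet = meters / 0.3048     # convert to feet

print(f"{meters:.2f} m ~= {feet:.1f} ft")
```

So a ~18 m (~59 ft) error in the baseline would indeed account for the whole effect.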

Confirmation of the results of an experiment by an independent party is standard practice in the scientific community. Without it, the findings wouldn't even be considered completely valid! Nothing to see here...

From TFA: “We should have a result in 4-6 months as the data is already taken. We just have to measure some of our delays more carefully,” - Jenny Thomas.

MINOS was already repeating their measurements, but CERN got the jump on them. It's anyone's guess too whether there was a back channel of information from OPERA to MINOS that might have tipped them off and encouraged them to start taking data early. With so many people involved, you almost have to assume that preliminary findings migrate across the Atlantic pretty quickly.

Of course, once they realized that some fool of an intern ordered a Denon AKDL1 link cable [amazon.com] (see first review)-- which of course unleashes all sorts of problematic physics-- everything became clear.

Once they replaced it with a link cable from Best Buy, the results were as expected.

The scientists at CERN asked for peer review and checking of their methodology. This announcement means that, at least on paper, the method looks solid enough for Fermilab to commit resources in the near future to prove or disprove it.

The people on the OPERA project are showing exemplary scientific conduct. It is admirable how they actively refused to speculate on theories to explain their findings, restating their job as unbiased experimental physicists. This is the pure scientific method.

I somewhat disagree. Their results met the criteria of scientific discovery, and they (well, I certainly hope!) reviewed their process for any error. So even though they literally, by scientific standard, discovered FTL particles, they explicitly state that they don't actually think they did, because it disagrees with existing theories. This is *biased* experimental physics.

Yes, relativity has a good track record, and they likely missed something. OTOH, neutrinos are still a pretty new research topic, and maybe relativity doesn't cover all the universe has to offer. I do think that these results should be retested, verified, and studied as much as possible. But I'm also seriously disappointed that an ostensibly legitimate discovery has to be presented as 'we screwed up but we don't know why, so look at these' in order to avoid rage from closed-minded scientists.

You say that because you're probably not intimately familiar with just *how* well established General Relativity is.

It's a theory which has survived decades of absurdly rigorous testing. Being cautious in how you present it is absolutely the correct approach - and far more responsible than how, say, the debacle over cold fusion [wikipedia.org] was handled.

These are not trivial measurements to make, nor is there any obvious explanatory theory that they confirm. The result also isn't a gross excess: it's well bounded, but a very small difference, on the same timescale as the signal delays in the individual components of the apparatus. It's only us sci-fi nerds who fully expect (want) FTL to be possible and Relativity broken somehow.

And yet General Relativity isn't even as well established a theory as Newtonian mechanics was (which had centuries of observational evidence backing it up), or for that matter geocentric theory, which had millennia of observations backing it up. (Every scientist before and during Galileo's time believed the Earth was stationary, except for a very, very tiny handful. It was actually the scientists, not religion, who rejected Galileo's theory when he first presented it.)

Both of them were overturned by more careful observations, in ways and of things we couldn't or hadn't observed before. We already know General Relativity has issues (specifically, with quantum mechanics), and while its predictions fit well with our observations so far, it hasn't actually been proven definitively. It is entirely possible that it is very accurate, but not precisely true. In fact, judging from the history of scientific theory, that is by far the most likely possibility.

New science nearly always happens when scientists find something they don't expect. These observations may be an error, or they may be the beginning of an entirely new theory that improves on General Relativity, just as Relativity improved on Newtonian physics.

Your metric for how established a theory is seems flawed, because it fails to take into account the rise in population (and the even greater rise in scientific output per capita). GR could be considered more established than Newtonian physics because, even though it's been the main paradigm for a shorter time, in that time it was recognised (and not disproven) by a larger number of people.

I'm quite familiar with how well tested GR is. I'm not, however, aware of any of those tests involving neutrinos. Besides, we already know we don't know how GR relates to quantum mechanics. Maybe this is just an example of that on a macroscopic scale. The point is that GR is extremely robust in those areas where it's been heavily verified, but that robustness doesn't automatically translate into robustness in other areas. Before we had high energies / high precision, Newtonian mechanics was also considered rock solid.

So even though they literally, by scientific standard, discovered FTL particles, they explicitly state that they don't actually think they did because it disagrees with existing theories. This is *biased* experimental physics.

If relativity is broken, much of modern physics falls apart. Not only that, but we have measured neutrino velocity before to within one part in a few million and they weren't FTL.

So given that, any sensible scientist will say 'here are our results, surely there's something wrong but we can't find it', and I think we can be almost certain that there is indeed something wrong in the measurements. We'll know sooner or later.

No. We already know how to generate and detect neutrinos at will. If they travel FTL, then that means we know how to send messages faster than light = backwards in time. This means we can break causality at will. That is a hell of a lot more than a footnote, it would completely upset our entire understanding of the universe.

I think you're missing a step there:

1. IF we can modulate at will sources of neutrinos which travel faster than light,
2. AND IF the interpretation of Special Relativity is true that claims that FTL speeds equal motion backwards in time,
3. THEN we can transmit information backwards in time.

Further:

4. IF we can transmit information backwards in time,
5. AND IF transmitting information backwards in time allows us to reverse the choice to send that information backwards in time,
6. THEN AND ONLY THEN do we have to worry about causality violation problems.

It's an interesting problem because there are a number of assumptions in this chain of reasoning.

First, I know it's taken as an axiom by physicists that "FTL equals backwards in time because relativity says so", but I'm not sure why we should believe, a priori, that this is in fact the case. We're talking about interpretations of relativity, not the core guts of it - the Lorentz contraction, which is the observable part. Certainly if (1) were true and it turned out that we didn't get (4), then it would seem obvious that (2) is not in fact true. This wouldn't invalidate most of the predictions of Special Relativity, nor its usefulness as a rule-of-thumb calculation tool, but it would invalidate the strict interpretation that nothing can ever ever ever go faster than light. It would just turn out that the Lorentz contraction is a dynamical, not a kinematic, effect - something which is generally true about large numbers of ordinary particles, but doesn't have to be the case for a few exceptions.

The general trend in high energy physics has been to see high-level "laws" as emerging from lower levels of reality which obey very different laws, and Einstein's wider relativity program for a Unified Field Theory never managed to describe the quantum world correctly. Why then should we assume that SR is exactly correct, and not just mostly correct? Einstein was smart enough to spot the problem back when he wrote the EPR paper; he believed in a fully real (ie non observer-dependent) world with hidden variables that couldn't send information to, say, update quantum correlations faster than light. Bell's Inequality proves that those two beliefs can't both be correct. We either have to throw away realism, throw away causality, or we have to throw away a hard lightspeed limit. Occam's Razor suggests that it would be a lot simpler to throw away the lightspeed limit than to throw away causality or realism, but ymmv I guess.

Abandoning a strict interpretation of Special Relativity as describing how time and space "really" behave doesn't mean abandoning all the observations built on it. For example, Oleg Jefimenko [wikipedia.org] has constructed equations which model the Lorentz contraction as a dynamical effect resulting from retarded electromagnetic emissions. The equations are a little harder to work with than the relativistic ones, but they appear to allow for a whole realm of FTL phenomena which is not actually violating causality. Some approaches to nuclear forces seem like they get a lot easier if you can postulate FTL signals at the scale of, say, inside an electron.

Carver Mead (the guy who, perhaps more than anyone else, really did invent VLSI microchips, and thus is responsible for the computer you're reading this on) also has his own interesting approach to electromagnetism [wikipedia.org] which is much more quantum than classical. It's intriguingly like Einstein's own vision of the universe as made of waves.

I did have a college physics course covering relativity, but it was a long time ago. Correct me if I am wrong, but Einstein's Special Relativity doesn't prohibit speeds faster than light. It just prohibits speeds EQUAL to the speed of light. If so, it would only be problematic to accelerate past the speed of light, or to decelerate from above it to below it.

OPERA has just found that either neutrinos travel 0.03% faster than photons we've measured, or their equipment has an unknown systematic error. Assuming there's no equipment error, I would find it more palatable to assume that light around Earth travels a bit below c and that neutrinos travel closer to c. What we think of as vacuum could really be a medium with refractive index 1.0003, perhaps due to a uniform background of weakly-interacting particles (maybe even dark matter) that affect photons but not neutrinos.

I have a physics undergrad degree; if there's someone here with better qualifications, would you care to weigh in on the idea that c could be 0.03% faster than the speed of light we measure on Earth?

My only regret is that the only people who might actually want to invest in 0.03% faster neutrino communication technology are HF traders, so they can shave another 60ns or so advantage from their competitors:-/

But who knows, maybe the galaxy is filled with neutrino-based communications we haven't been tuned into, and someday SETI will pick up their messages of "sell! sell!"

Meh... All jokes aside, if neutrino communications could be managed (in particular, efficient and highly directional emitters/detectors), it would be worth a lot regardless of this experiment.

First, light only travels at about 66% of c in fiber, and said fiber needs to wrap around the circumference of the Earth. So not only would neutrinos travel faster than light in a fiber (as they go pretty close to c for sure), they could also just go straight there through the Earth. You'd probably be looking at rough
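The latency win can be sketched with round numbers (Earth radius, an antipodal route, and the ~0.66 c fiber speed are all ballpark assumptions):

```python
import math

# Rough latency comparison for an antipodal link: light in fiber at
# ~0.66 c around half the Earth's circumference, vs. neutrinos near c
# straight through the planet.
c = 299_792_458.0        # m/s
R = 6.371e6              # mean Earth radius, m

fiber_path = math.pi * R              # half-circumference along the surface
fiber_time = fiber_path / (0.66 * c)  # seconds
chord_path = 2 * R                    # straight through the center
neutrino_time = chord_path / c        # treating neutrino speed as ~c

saving_ms = (fiber_time - neutrino_time) * 1e3
print(f"fiber {fiber_time*1e3:.1f} ms vs neutrino {neutrino_time*1e3:.1f} ms "
      f"(~{saving_ms:.0f} ms saved)")
```

Tens of milliseconds is an enormous edge by HF-trading standards, with or without any FTL effect.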

So not only would neutrinos travel faster than light in a fiber (as they go pretty close to c for sure), but they could also just go straight there through the Earth

I was thinking about this but if I understand correctly, neutrinos are "filtered out" as they pass through greater amounts of solid material, so to ensure a signal would go through a great deal of solid material (like the whole planet) it would take a great deal of energy on the emitter side.

I have little more than high-school physics knowledge, so I don't know if the amount of energy required alone would make it impractical. But I do know that instruments used to detect neutrinos from the sun are placed underground.

It's an interesting idea, but quite unlikely... Remember that the speed of light is (supposedly!) an absolute, somewhat like absolute zero, and things tend to approach it asymptotically. One can therefore see where exactly the asymptote lies, and we'd quite likely notice the difference. For example, particles in the LHC travel at c - 0.0000009% and have the corresponding properties as predicted by relativity. If they were, in fact, traveling at c - 0.03%, our calculations would be off by more than two orders of magnitude (gamma ≈ 7500 vs ≈ 40).

In short, that much error in c would pretty much wreck relativity anyways.
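The Lorentz-factor comparison is easy to check numerically; a minimal sketch, using the two speeds quoted above as fractions of c:

```python
import math

def gamma(beta):
    """Lorentz factor for speed v = beta * c."""
    return 1.0 / math.sqrt(1.0 - beta**2)

gamma_lhc  = gamma(1 - 9e-9)   # LHC-proton-style speed, c - 0.0000009%
gamma_slow = gamma(1 - 3e-4)   # hypothetical c - 0.03%

print(round(gamma_lhc), round(gamma_slow))
```

Which is why a 0.03% shortfall in c would have shown up in accelerator data long ago.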

With the caveat that I don't really have better qualifications than you. :)

If that were the case, we should be able to accelerate particles to faster than light speeds. There's nothing that prevents a particle from traveling above c in a material with an index of refraction > 1; see Cherenkov radiation [wikimedia.org].

Plausible, but it will be really interesting if the neutrinos can travel faster than light. Why? Imagine the possibilities. And I could offer another possibility: what if the photon has mass (really small, but not zero), and this mass is slightly larger than that of a neutrino? This would cause the neutrino to be faster than the photon.

What we think of as vacuum could really be a medium with refractive index 1.0003

Ahh, the old subatomic ether thing. Look up the Michelson-Morley interferometer experiment that led to all that relativity stuff... At 300 ppm, that effect, if it existed, would prevent most interesting interferometer technology from existing: no FT-IR spectroscopy, most inertial navigation systems would be too drifty to use, astrophysicists would not be able to do interferometry using multiple scopes...

The other problem is that we've verified E=mc^2 and time dilation to much better than 300 ppm, both of which depend on c.

Also, it's expensive, and a bit beyond my basement, but your average RF engineer can build stuff to better than 300 ppm from first principles.

Then you start offending the chemists. I have to think about it a bit, but wouldn't this screw up quite a bit of chemistry (and physics) related to ferromagnetic materials? And the NMR scanners wouldn't work right, or at least how they work would depend on the phase of the moon; from memory, 300 ppm is a pretty huge shift.

Only an undergrad myself - but I was thinking the same thing. The implications of FTL would enable the creation of thought experiments breaking most known laws of physics (at least as we know them).

On the other hand, light travelling slightly slower than what maybe ought to be called the "causality propagation limit" would only challenge our knowledge about the nature of the vacuum - which is already up for debate. Light already travels slower than c in all substances other than vacuum, and Einstein certain

Surely with an undergraduate degree you did the derivation of the wave equation in free space from Maxwell's equations?

The only part you might have missed (I'm sure you'll have been told it but might not have realized the significance) is that Maxwell's equations are independent of the inertial frame that you pick. And therefore light propagates at c in all inertial frames.

Special relativity is what falls out if you assume that Maxwell's equations are correct.

Hasn't the value of c been verified to within a very tight tolerance, many times, by a great many people, and with much rigor? If we now found c to be different, you'd have to explain a mountain of evidence, ie how did everyone come up with the same number every single time, even though they were doing it in slightly different ways, in different places, and importantly in different frames of reference? Surely the systematic error would have shown up somewhere?

A GPS clock is accurate to within 14ns. Your proposed variation from c would throw it off more than that.
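The size of that error is easy to estimate; a sketch assuming a round ~20,200 km satellite altitude and a one-way timing measurement:

```python
# How much would a 0.03% error in c shift a GPS signal timing?
c = 299_792_458.0
altitude_m = 20_200e3                # GPS orbital altitude, approx.

flight_s = altitude_m / c            # one-way signal travel time
error_s  = flight_s * 3e-4           # 0.03% fractional error in c
print(f"flight ~ {flight_s*1e3:.1f} ms, timing error ~ {error_s*1e9:.0f} ns")
```

That comes out around 20 microseconds against a 14 ns budget, i.e. roughly three orders of magnitude too big to hide.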

The value of c has been well established over the past century through experimental measurement. That said, I'm not sure how many of those experiments have been done through solid rock. Could the mass in the rock be tunneling the neutrinos somehow?

Well, from what I gather, they've not raced photons in a vacuum alongside the neutrinos; they've simply calculated the distance and found the neutrinos arrive faster than they theoretically should. So the first step would be to double-check the distance, atomic clocks, and sensor delays; if those are off, then the discrepancy would disappear in a puff of smoke.

Like someone pointed out in an earlier article, if neutrinos generally traveled 0.03% faster than light then we'd see delays in years on supernova bursts but

The evidence from supernova 1987A [wikipedia.org] seems to contradict this. Neutrinos from the supernova would have arrived years before the light if c were 0.03% faster than we measure on Earth. Instead they arrived a few hours earlier, which is to be expected, as light from the initial explosion took some time to emerge from the exploding star whereas the neutrinos did not.
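The SN1987A argument can be checked with the OPERA figures themselves (the ~730 km baseline, 60 ns excess, and ~168,000 light-year distance to the supernova are the rough numbers from this thread):

```python
# If neutrinos really beat light by the OPERA margin, what head start
# would SN1987A neutrinos have had over its light?
c = 299_792_458.0           # m/s

baseline_m = 730e3          # CERN -> Gran Sasso, approx.
excess_s   = 60e-9          # reported early arrival
flight_s   = baseline_m / c
epsilon    = excess_s / flight_s        # fractional speed excess

lead_years = epsilon * 168_000          # head start over 168,000 light-years
print(f"epsilon ~ {epsilon:.1e}, lead ~ {lead_years:.1f} years")
```

About four years of head start, against the few hours actually observed.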

What kind of technology and materials would we need to get the giant Fermilab etc. down from square kilometres down to square metres or even inches? Would cheap fusion energy, or room-temperature super-conductors, or limitless supplies of carbon nanotubes/diamond/graphene help reach that particular goal?

What kind of technology and materials would we need to get the giant Fermilab etc. down from square kilometres down to square metres or even inches? Would cheap fusion energy, or room-temperature super-conductors, or limitless supplies of carbon nanotubes/diamond/graphene help reach that particular goal?

A limitless supply of gold would seem to be prerequisite.

Seriously though, the killer is square-cube law problems. Dump a few megawatts into a few hundred square kilometres of "stuff" and it scarcely gets above room temperature. Dump a few watts into a few square cm and you have what's known as a "soldering iron"... Of course, with infinite money I suppose you could develop a semiconductor industry designed around a thousand-degree operating temperature, with all new substrates and dopants and packaging.
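The power-density point can be made concrete with round numbers (all of them illustrative guesses, not measured figures):

```python
# Same class of power, wildly different power density.
ring_power_w = 3e6          # "a few megawatts" into an accelerator complex
ring_area_m2 = 3e8          # a few hundred square kilometres of site

iron_power_w = 30.0         # a soldering iron
iron_area_m2 = 3e-4         # a few square centimetres of tip and barrel

ring_density = ring_power_w / ring_area_m2   # W/m^2
iron_density = iron_power_w / iron_area_m2   # W/m^2

print(f"ring: {ring_density:.2f} W/m^2, iron: {iron_density:.0f} W/m^2")
```

Seven orders of magnitude in power density separate the two, which is the thermal problem with shrinking the machine.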

Stronger magnets are always going to be advantageous for a particle accelerator, so yeah, room-temperature superconductors (ones that have all the necessary properties to make good electromagnets) would be a major breakthrough. However, in terms of making an accelerator like the Tevatron or the LHC smaller, there are some physical economies of scale that make see-it-from-space rings more suitable than lab scale. Circular accelerators lose energy to synchrotron radiation, and these losses are inversely proportional to the bending radius.

The theory of Relativity still holds true; what this experiment (if it's accurate) changes is our idea of matter and causality: if neutrinos have imaginary mass, they are allowed to travel faster than light, as tachyons, and causality may have to be revised, from an onward-moving arrow to a regular dimension in which the future can influence the past.

I can only assume that they've corrected for General Relativity. Everyone seems to be pointing to the obvious potential sources of error: knowing when the neutrinos are created, knowing when they arrive, knowing the distance that they've traveled.

What about variations in the Earth's gravitational field between the two clocks? Or along the path that the neutrinos follow? You can't call the planet a point-source of gravity - the density of matter is quite lumpy.

I haven't seen a back-of-the-envelope calculation for this... maybe it's orders of magnitude too small to matter? Would it require a tiny black hole to throw the timing off by 60 ns, or would a big uranium deposit be enough? I could probably do the Lorentz transforms for Special Relativity myself, but General is a bit beyond me!
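Here's one such back-of-the-envelope, as a sketch only: take the fractional gravitational time dilation at Earth's surface, roughly GM/(Rc^2), and apply it over the whole ~2.4 ms flight time (this ignores the detailed geometry, so it's an upper-bound-flavoured order-of-magnitude check, not a real GR calculation):

```python
# Could Earth's gravity plausibly account for 60 ns?
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24             # Earth mass, kg
R = 6.371e6              # Earth radius, m
c = 299_792_458.0        # m/s

dilation = G * M / (R * c**2)        # ~7e-10, dimensionless
flight_s = 730e3 / c                 # OPERA baseline flight time
shift_ns = dilation * flight_s * 1e9

print(f"dilation ~ {dilation:.1e}, shift ~ {shift_ns:.1e} ns")
```

Picoseconds, not nanoseconds, so plain gravitational time dilation looks several orders of magnitude too small to explain 60 ns.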

Interestingly, someone was talking about the start of the "nitpicking" in connection with this experiment. I can't remember the article, or I'd link it, but they mentioned things like adjusting for the oblateness of the Earth, the tidal pull of the Moon, the GPS locations of the labs, and such things. Stuff that makes a pretty small difference to the measured distance in % terms, but I guess this needs to be super-accurate.

I'm not even remotely qualified to comment on this, but I seem to remember light being affected by gravity and thus the mass around it, whereas neutrinos are virtually unaffected by normal matter. What this says to me is that the neutrinos are showing us what the actual speed limit of the universe is, compared to what we think it should be as observers sitting on a giant ball of gravity-rich mass. Basically, in space they go the same speed, which is why the neutrinos and photons from a distant stellar event arrive at nearly the same time.

It does seem as if natural neutrino sources should have made this effect very apparent years before now, if CERN is really seeing what they think they are. I can, however, think of a couple of reasons why this is different, although I really doubt that this proves FTL neutrinos are real. The CERN experiment was originally about detecting type change in neutrinos, with the detector spotting only one type of conversion (Electron Neutrinos that had converted to Muon Neu

I don't know about Tesla, but this is Yet Another Example... Of the standard scientific method.

You never trust a single result, the experiment always has to be repeated especially in the case of unexpected findings. What I'm really waiting for is data from other accelerators, or experiments (given this experiment may be prohibitively difficult to properly replicate) to corroborate the findings.

Having neutrinos fly at 'true c' rather than a lower 'apparent c' isn't a good solution, because it doesn't take into account the neutrino burst from supernova 1987A. The neutrinos from that supernova were detected only four hours before the light from it. That's explainable with what we know about internal stellar processes. But if the neutrinos were flying FTL then they should have arrived four years earlier.

The most likely explanation for the CERN results (apart from experimental error) is that neutrinos are tachyonic -- they have imaginary mass, and naturally fly faster than light. The higher their energy, the closer to lightspeed they travel.

That's not a trivial situation. To use a technical term, it breaks relativity into itty bitty pieces. We will have to change a lot of theories around. But it's unlikely that the value of c is going to change.

To a neutrino, space and the planet Earth are almost equally transparent. The neutrinos from OPERA and the neutrinos from SN1987A should be travelling at the same c, and they (apparently) aren't.

The one real difference is that the planet has a gravitational field. That could support some theories which suggest that neutrinos are able to take shortcuts through extra dimensions, but only in the presence of a gravity field. That result would still make relativity choke and turn blue, but it might make sense

The most likely explanation for the CERN results (apart from experimental error) is that neutrinos are tachyonic -- they have imaginary mass, and naturally fly faster than light. The higher their energy, the closer to lightspeed they travel.

This would be backwards. SN1987A neutrinos were in the 10MeV range so should be much more super-luminal than the 17GeV neutrinos being measured at CERN. So I think that rules out tachyonic neutrinos.
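That energy dependence can be sketched numerically. Assuming the usual tachyonic dispersion relation E^2 = p^2 c^2 - mu^2 c^4 (so v/c = sqrt(1 + (mu c^2 / E)^2), i.e. lower energy means faster), and using the thread's rough figures of a 17 GeV OPERA beam, a 2.5e-5 speed excess, and 10 MeV SN1987A neutrinos:

```python
import math

def v_over_c(mu_c2_gev, e_gev):
    """Tachyon speed in units of c, for tachyonic mass scale mu*c^2."""
    return math.sqrt(1.0 + (mu_c2_gev / e_gev) ** 2)

# Infer the tachyonic mass scale that would explain the OPERA excess:
excess = 2.5e-5                                    # (v - c)/c at 17 GeV
mu_c2  = 17.0 * math.sqrt((1 + excess) ** 2 - 1)   # in GeV

print(f"mu c^2 ~ {mu_c2*1e3:.0f} MeV")
print(f"v/c at 10 MeV: {v_over_c(mu_c2, 0.010):.1f}")
```

A ~120 MeV tachyonic mass scale fitted to OPERA would make 10 MeV neutrinos wildly superluminal, which SN1987A's few-hour lead rules out.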

Assuming that this is a real result, the most likely explanation is goin

If they succeed in recreating the measurements, doesn't it just mean that c was set at too low a value, and that the true speed of light in a vacuum is slightly faster than originally thought?

c is not a free parameter; it's a function of the permeability and permittivity of either empty space or some dielectric (something like the inside of a piece of coaxial cable, etc.). Or rephrased: you are arguing the impedance of free space is wrong, and generations of antenna and RF engineers would disagree with you. Also, c shows up in mass-energy equivalence, E=mc^2 and all that, which seems quite accurate. And in time dilation experiments it seems to work quite well. Astrophysics "stuff" that's far away seems to confirm that neutrinos do not exceed light speed in vacuum; this test involved blasting through rock instead of vacuum, so that is no huge problem; there's a long history of shoving light through materials resulting in weird behavior. Given how many decimal places that kind of stuff has been verified to, more than this result which was only 6 sigma or whatever, I'm thinking fundamental constant fine-tuning is awfully unlikely.

In summary, either it's wrong (which seems unlikely given all the verification they did) or it's new physics. Simply tuning up the known constants is just not gonna work.

To fit other, higher-precision experiments, it's gotta boil down to something like the logical inverse of the light refraction law, where light slows down in certain materials (like, say, glass), resulting in refraction and timing issues (like pulse dispersion in optical fiber). The analogy is that maybe neutrinos "speed up" when rammed through solid rock due to some strange property of rocks, or floating about in a rock-produced gravity well, or something like that.

I can totally see how previous subatomic experiments would miss the neutrino effect; after all, it's hard to shove gammas or plain ole light quanta through a couple zillion km of solid rock... It's too technologically hard to do, until you try neutrinos...

A good example of how F-ing around in the lab doing blue-sky stuff, simply because you can, is the primary source of interesting ideas.

Why can't c be just slightly faster than it has been estimated at? I mean, no one has ever been able to measure the speed of light in a true vacuum, right? A true vacuum would contain absolutely no particles and no electromagnetic waves. That is impossible to obtain, so how does anyone really know how fast c really is?

Maybe neutrinos are simply lithe enough that they are almost unaffected by the non-vacuum; I mean, it has been theorized that to completely block a neutrino flux you'd need something like a light-year of lead.

c isn't just the speed of light. It's a constant that appears in all kinds of equations: sometimes as the speed of light, sometimes via the permeability and permittivity of vacuum (c = 1/sqrt(mu0*eps0) from Maxwell's equations), sometimes as the ratio between matter and energy (E=mc^2), sometimes as the fundamental ratio between space-like and time-like quantities (relativity, etc.), and so on. It's quite amazing that this same constant comes out with the same value in all these different ways. (And, again, we can measure this constant in totally different experiments and come up with the same value.) This points to a fundamental symmetry in our universe, a realization which gave rise to relativity, quantum physics, and so on.

In short, you shouldn't think of it as merely being the speed that light (or any other particle) travels. It's a fundamental value that is deeply entrenched in just about every branch of physics you can think of. It so happens that it's also the speed that photons travel at. (That's no accident, of course.) Changing the value of c even slightly would propagate through all of our physics equations, and would lead to totally different predictions for a host of results. (More specifically, we would start getting the wrong predictions for many things!)

So the explanation for this new result must be something rather more subtle than just adjusting c.
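One of those appearances is easy to reproduce: the classical relation c = 1/sqrt(mu0 * eps0) from Maxwell's equations, using standard values for the vacuum constants:

```python
import math

# c "falls out" of electromagnetism: c = 1 / sqrt(mu0 * eps0).
mu0  = 4 * math.pi * 1e-7      # vacuum permeability, H/m
eps0 = 8.8541878128e-12        # vacuum permittivity, F/m

c = 1.0 / math.sqrt(mu0 * eps0)
print(f"c ~ {c:.0f} m/s")
```

Same number as every direct measurement, derived from electromagnetism alone; that's the sense in which you can't just nudge c.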

Because relativity starts to break down when things exceed the speed of light.

Wikipedia sez [wikipedia.org] that going faster than the speed of light breaks causality -- so signals can be received before they're sent. However, as you suggest there is plenty of room to rework the theory rather than throwing up our hands and declaring reality broken.

It's not so simple. We've measured the speed of light to great precision. We know what that speed is, and we know photons are massless, so we know with very high confidence what the speed of massless particles is. If neutrinos travel faster than light, then this is very surprising and points to something new and interesting. I'm avoiding referring to 'c' because it would be ambiguous: in traditional relativity, the constant speed of light is equal to the maximum possible speed, which is also in essence the ratio between space-like and time-like variables in the theory (the slope of light-cones and all that). It's a constant that reappears over and over again, and marvelously it's precisely equal to the speed of light. It can't be as simple as just "we were wrong, c is a bit higher than we thought" because it would immediately mean that "c" isn't as universal as we thought: the symmetry of the universe must be somehow different so that photons and neutrinos (and probably other particles) follow slightly different rules.

But if this result is indeed true, and neutrinos travel faster than light, then this is truly amazing and could mean different things. One possibility is that different particles actually have different 'speed limits' (and different causal cones), so there is c_light, c_neutrinos, etc. There are many other possibilities (extra dimensions, breaking of Lorentz invariance, imaginary mass, closed timelike curves, etc.). All of them amount to a substantial rethinking to some aspect of physics. This is definitely exciting, since it could be telling us something very new! And it won't be as simple as just adjusting a constant a bit. (If we tweak the value of "c" in our equations even just a bit, all kinds of well-tested observations, in everything from cosmology to the functioning of transistors, would come out wrong...).

Lastly, it's worth keeping in mind that it's probably a subtle experimental error (very subtle!). This is still useful, because it will teach us something new about experiment design and possibly even teach us something about particle physics. For instance, the timing calculation is based on certain models of the packet of neutrinos that are generated. But, it could be that the packet that arrives at the end is slightly different than the one sent out at the beginning, thus altering the way one should compute the flight time. This could point to some interesting, previously unknown, ways in which neutrinos are generated, or interact with matter, or interact with each other. In any case it will be interesting.

If they succeed in recreating the measurements, doesn't it just mean that c was set at too low a value, and that the true speed of light in a vacuum is slightly faster than originally thought?

No, probably not. Einstein came up with relativity after a thought experiment concerning what a light wave would look like if you were traveling at its velocity. Electromagnetism does not allow for a stationary wave solution in a vacuum, so he figured out that the way out was to have time stop at the speed of light. If the

1. Yes, they can see each other. Why should they not? You can easily hear a fighter jet flying faster than the speed of sound; it's similar here. But don't get confused: you will only ever be able to see the past "image" of the other object; this image is traveling towards you at c.

2. More than 1 year. Your idea of a rod is not quite right. Think of it like a big rubber band, then make it stiffer and stiffer. If you pull too hard you would rip off one end, but in any case they would probably not notice it for w

Another thought experiment: a 1" rod, 1 light-year long. You move it 1/16 of a centimeter. How long does it take for the movement to register at the other end?

A: It's instant, for it does not need to move any faster than the time it took you to move it.

The atoms of the material would carry the signal to the other end at, at most, the speed of sound in the material. There would be a compression wave travelling along the rod. It would seem to you as if the whole rod moved, but in reality the effect wouldn't be instantly noticeable at the other end. The rod wouldn't have to be so long, either; you could measure the same delay using a much shorter rod. Try tapping your mobile phone: the other end seems to move instantly, but in reality there is a delay.
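The delay across a light-year rod is easy to estimate; a sketch assuming the push propagates as a compression wave at roughly the speed of sound in steel (~5,000 m/s, an assumed figure):

```python
# How long does the "push" take to cross a 1-light-year rod?
LIGHT_YEAR_M   = 9.4607e15        # metres in one light-year
SOUND_IN_STEEL = 5_000.0          # m/s, rough

seconds = LIGHT_YEAR_M / SOUND_IN_STEEL
years   = seconds / (3600 * 24 * 365.25)
print(f"~{years:,.0f} years for the far end to move")
```

Tens of thousands of years, not "instant".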

2. What if it was a single atom? If you "push" an atom, does the other side move immediately, or is there a small delay? I'm guessing there's a small delay, or we'd be able to transmit data faster than the speed of light, albeit only over relatively small distances, such as the width of an atom.