Thursday, June 30, 2011

The Fabric of the Cosmos
Acclaimed physicist Brian Greene reveals a mind-boggling reality beneath the surface of our everyday world. Airing November 2, 9, 16, and 23, 2011 on PBS...

Greene brings quantum mechanics to life in a nightclub like no other, where objects pop in and out of existence, and things over here can affect others over there, instantaneously and without anything crossing the space between them. ...

Hard as it is to swallow, cutting-edge theories are suggesting that our universe may not be the only universe. Instead, it may be just one of an infinite number of worlds that make up the multiverse. In this show, Brian Greene takes us on a tour of this brave new theory at the frontier of physics, explaining why scientists believe it's true and showing what some of these alternate realities might be like.

No one has ever been able to produce an experiment where "things over here can affect others over there". The multiverse is more theology than physics. No real scientists believe that it's true.

Tuesday, June 28, 2011

Among the many people in San Francisco taking drugs in the early 1970s were members of a maverick group of Berkeley physicists who called themselves the Fundamental Fysiks Group. The young scientists dabbled in mind-altering drugs as they searched for a quantum-physics-based explanation for such phenomena as telepathy and extrasensory perception. The scientific basis for this quest was the experimental confirmation that once two quantum entities (such as electrons) have interacted with one another, they remain connected by what Einstein called "spooky action at a distance." The connection is technically known as entanglement; if one of the entities is prodded, the other one jumps.

As David Kaiser deftly spells out in "How the Hippies Saved Physics," these physicists based their work on good science, however drug-fogged were their aims. Entanglement is at the heart of today's uncrackable quantum encryption; it makes the "teleporting" of particles over distances of several miles feasible; and entanglement may soon be employed in the production of quantum computers that will make the best contemporary computer look like an abacus.

No, Einstein was wrong, and there is no such "experimental confirmation". I hope to explain that in detail later. It is partially explained in How Einstein Ruined Physics. I also mention this book below.

Gribbin once wrote a wacky book on The Jupiter Effect, as criticized here. Since that book, he has been a more mainstream science journalist, and some of his books explain physics pretty well. He goes on:

Mr. Kaiser makes a neat analogy with the way, in the 19th century, people would try to invent perpetual-motion machines. In trying to explain why perpetual-motion machines could not work, physicists were led to a deeper understanding of physics, one that became a foundation of thermodynamics. The moral is that it is always useful to have a few mavericks prodding away at the fringes of science to keep folks on their toes.

As for quantum cryptography, it has been making steady advances since the day in 2004 when it was employed for a secure communications channel in a financial transaction between a major bank and the Viennese City Hall. Similar signals have been tested using wireless transmissions over a distance of about 100 miles, sufficient for them to be bounced off Earth-orbiting communications satellites in the future. Before long, the Internet is likely to be using quantum cryptography, making it impossible for hackers to intercept your credit-card details when you make a purchase. Articles about quantum physics now appear on newspaper business pages as well as in Scientific American.

I accept that analogy. I believe that quantum cryptography and quantum computers are just like those 19th century perpetual-motion machines, and will never work because they are contrary to established laws of physics. At best, their failure will convince people that Bohr and others were right about quantum mechanics all along.

Monday, June 27, 2011

The current (July) SciAm interviews Leonard Susskind as a physics bad boy. Susskind praises Thomas Kuhn for paradigm shift theory, and gives Einstein's sudden discovery of relativity and spacetime geometry as his first example.

He is a big proponent of string theory, but he is down on reality because string theory will never say anything about reality. Now he likes the multiverse.

The Susskind interview is also criticized here. I also criticize him in my book. He was once a highly respected physicist.

Saturday, June 25, 2011

Steve: Early in the history, the modern history of string theory formulations, there were some physicists who really didn't like string theory, because it wasn't testable enough to be other than — in their opinion — kind of, philosophical musings; and they thought it wasn't even really science. And how has the field progressed since then?

... that's actually a criticism, as I try to discuss in the book, not specific to string theory. It's also true of the various alternatives to string theory; ...

Yeah, the Large Hadron Collider will really be the most closely watched instrument in physical science, at least over the next few years. It is actually the most expensive scientific instrument of any sort ever built. ... Now it's, again — as I've emphasized earlier — it's not a question of strictly proving or strictly disproving string theory; that's beyond even the Hadron Collider's ability. It's more of a hint level.

So the most expensive scientific instrument ever built won't say anything about whether string theory is true or false. And the problem is not just with string theory but also with many other theoretical models, as I also explain in my book.

So, at the dawn of our universe — and I have to emphasize our universe, because there could be others — so, dawn of our universe, physicists think there was one type of force, one type of matter and that as the cosmos expanded, as space expanded, it cooled and things started to condense out like snow flakes, and over time that single force broke, it differentiated; and something similar happens in the human body as we develop from a single cell; we differentiate, different tissues form in our bodies, different layers of tissues. Something similar happened, physicists think, in our universe, that over time this single force somehow differentiated into the four forces that we know today.

This is the core of his belief. It appeals to his aesthetics that there might have been a moment during the big bang where gravity and electromagnetism had comparable strength. He wants to believe that, even if he has to believe in 6 extra dimensions. It is like an argument to believe in one god instead of many gods.

Thursday, June 23, 2011

Despite quantum theory's knack for explaining experimental results, some physicists have found its weirdness too much to swallow. Albert Einstein mocked entanglement, a notion at the heart of quantum theory in which the properties of one particle can immediately affect those of another regardless of the distance between them. He argued that some invisible classical physics, known as "hidden-variable theories", must be creating the illusion of what he called "spooky action at a distance".

A series of painstakingly designed experiments has since shown that Einstein was wrong: entanglement is real and no hidden-variable theories can explain its weird effects. ...

They found that the resulting statistics could only be explained if the combination of properties that was tested was affecting the value of the property being measured. "There is no sense in assuming that what we do not measure about a system has [an independent] reality," Zeilinger concludes.

Steinberg is impressed: "This is a beautiful experiment." If previous experiments testing entanglement shut the door on hidden variables theories, the latest work seals it tight. "It appears that you can't even conceive of a theory where specific observables would have definite values that are independent of the other things you measure," adds Steinberg.

There are dozens of papers like this, claiming to have finally proved that Einstein was wrong in 1935. I thought that everyone was convinced that Einstein was wrong by 1936, so I am not sure what the big deal is. Von Neumann had already published a quantum mechanics book in 1932 with an argument against hidden variable theory.

Physicists in China have broken their own record for the number of photons entangled in a "Schrödinger's cat state". They have managed to entangle eight photons in the state, beating the previous record of six, which they set in 2007. The Schrödinger's cat state plays an important role in several quantum-computing and metrology protocols. However, it is very easily destroyed when photons interact with their surroundings, prompting the researchers to describe its creation in eight photons as "state of the art" in quantum control.

Wednesday, June 22, 2011

It is widely believed that the principal difference between Einstein's special relativity and its contemporary rival Lorentz-type theories was that while the Lorentz-type theories were also capable of “explaining away” the null result of the Michelson-Morley experiment and other experimental findings by means of the distortions of moving measuring-rods and moving clocks, special relativity revealed more fundamental new facts about the geometry of space-time behind these phenomena. I shall argue that special relativity tells us nothing new about the geometry of space-time, in comparison with the pre-relativistic Galileo-invariant conceptions; it simply calls something else "space-time", and this something else has different properties. All statements of special relativity about those features of reality that correspond to the original meaning of the terms "space" and "time" are identical with the corresponding traditional pre-relativistic statements. It will be also argued that special relativity and Lorentz theory are completely identical in both senses, as theories about space-time and as theories about the behavior of moving physical objects. ...

They are not only “empirically equivalent”, as sometimes claimed, but they are identical in all senses; they are identical physical theories.

Consequently, in comparison with the classical Galileo-invariant conceptions, special relativity theory tells us nothing new about the spatiotemporal features of the physical world. As we have seen, the longstanding belief that it does is the result of a simple but subversive terminological confusion.

He notes that others (including Einstein himself) have acknowledged that Lorentz's and Einstein's theories have the same empirical consequences, but conventional wisdom is that Einstein's special relativity is a superior theory. Szabo argues that they are really the same.

As I detail in my book, when Einstein's famous paper was published in 1905, no one thought that it was saying anything different from what Lorentz said years earlier. Not even Einstein claimed that it was different from the well-known Lorentz electron theory.

Szabo explains that in 1920 Einstein did try to argue that his special relativity was superior because he did not refer to the aether. But Szabo also notes that by 1920 Einstein himself was advocating existence of the aether.

My only disagreement with Szabo is that he sloughs over some historical details.

Though, it is a historic fact that Lorentz, FitzGerald, and Larmor, in contrast to Einstein, made an attempt to understand how these laws actually come about from the molecular forces.

Einstein's own story is that he tried to understand that, but he failed, and did not publish his attempts. Some people argue that Einstein's approach was superior because he did not publish such attempts, but that was not his view. Einstein thought that such a molecular explanation was important and desirable.

Actually, Einstein completely missed this interpretation in his 1905 paper, and did not even understand it when Minkowski published it 3 years later. It is historically inaccurate to call this Einstein's theory.

32. Many of those, like Einstein himself (see Point 25), who admit the “empirical equivalence” of the Lorentz theory and special relativity argue that the latter is “incomparably more satisfactory” (Einstein) because it has no reference to the aether. As it is obvious from the previous sections, we did not make any reference to the aether in the logical reconstruction of the Lorentz theory. It is however a historic fact that, for example, Lorentz did.

No, it is not a historical fact. Lorentz mentions the aether only to explain why he was rejecting the aether drift theories. Einstein mentions the aether in 1905 to say that it is superfluous to his derivation. There is no significant difference between what Lorentz and Einstein said about the aether. My book has the details.

I do agree with Szabo's main point that support for Einstein's originality over Lorentz stems from a "subversive terminological confusion."
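The mathematical core shared by the two theories is just the Lorentz transformation, and it is easy to check numerically that it preserves the spacetime interval. A minimal sketch (units with c = 1; the sample values are arbitrary):

```python
import math

# Both "Lorentz's theory" and "Einstein's theory" use the same Lorentz
# transformation.  A quick check that it preserves the spacetime
# interval t^2 - x^2, the invariant behind the spacetime geometry.
def boost(v, t, x):
    """Boost the event (t, x) to a frame moving at velocity v."""
    g = 1 / math.sqrt(1 - v * v)   # the Lorentz factor gamma
    return g * (t - v * x), g * (x - v * t)

t, x, v = 3.0, 2.0, 0.6
t2, x2 = boost(v, t, x)
print(t * t - x * x, t2 * t2 - x2 * x2)  # same interval in both frames
```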

Monday, June 20, 2011

The quantum computer folks brag that their greatest accomplishment is quantum factoring, and the proof is the experiment that factored the number 15. But those claims are bogus, as Nature magazine reports:

The notion that quantum computing can be done only through entanglement was cemented in 1994, when Peter Shor, a mathematician at the Massachusetts Institute of Technology in Cambridge, devised an entanglement-based algorithm that could factorize large numbers ...

Clues that entanglement isn't essential after all began to trickle in about a decade ago, with the first examples of rudimentary quantum computation. In 2001, for instance, physicists at IBM's Almaden Research Center in San Jose and Stanford University, both in California, used a 7-qubit system to implement Shor's algorithm, factorizing the number 15 into 5 and 3. But controversy erupted over whether the experiments deserved to be called quantum computing, says Carlton Caves, a quantum physicist at the University of New Mexico (UNM) in Albuquerque.

The trouble was that the computations were done at room temperature, using liquid-based nuclear magnetic resonance (NMR) systems, in which information is encoded in atomic nuclei using an internal quantum property known as spin. Caves and his colleagues had already shown that entanglement could not be sustained in these conditions. "The nuclear spins would just be jostled about too much for them to stay lined up neatly," says Caves. According to the orthodoxy, no entanglement meant no quantum computation.

The NMR community gradually accepted that they had no entanglement, says Jiangfeng Du, an NMR-computing specialist at the University of Science and Technology of China, in Hefei. [Published online 1 June 2011 | Nature 474, 24-26 (2011) | doi:10.1038/474024a]

The article argues that the factoring of 15 was still legitimate, even if it did not use quantum computing. But a primitive 1940s computer could factor 15. If no entanglement was used, then the result is trivial and the whole thing is a hoax.
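Factoring 15 classically really is trivial; a few lines of trial division (illustrative only) do it instantly:

```python
def trial_division(n):
    """Factor n by brute-force trial division -- the kind of
    arithmetic even a primitive 1940s computer could do for n = 15."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # leftover prime factor
    return factors

print(trial_division(15))  # → [3, 5]
```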

Sunday, June 19, 2011

Physics World reports on researchers demonstrating a full eavesdropper on a quantum key distribution link. Unlike conventional exploits for security vulnerabilities that are often just a piece of software, spying on quantum cryptography required a box full of optics and mixed-signal electronics. Details are published in Nature Communications, and as a free preprint. The vulnerability was known before, but this is the first actual working exploit with secret-key recording confirmed. Patching this loophole is in progress.

I have pointed out defects in quantum cryptography before. The proponents claim that quantum physics proves that their systems are unbreakable, but they misunderstand the physics. Every such system has been broken.
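For anyone curious how QKD is supposed to catch a spy, here is a sketch of the textbook intercept-resend attack on BB84, which introduces a detectable error rate of about 25% in the sifted key. The detector exploit described above is different: it records the key without leaving such errors. (The simulation is my own illustration, not from the Physics World article or the paper.)

```python
import random

def bb84_qber(n=100000, eavesdrop=True, seed=0):
    """Estimate the sifted-key error rate of BB84 with and without an
    intercept-resend eavesdropper.  Bases are 0/1 (rectilinear or
    diagonal); measuring in the wrong basis randomizes the bit."""
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n):
        bit = rng.randint(0, 1)
        basis_a = rng.randint(0, 1)          # Alice's basis
        if eavesdrop:
            basis_e = rng.randint(0, 1)      # Eve guesses a basis
            bit_e = bit if basis_e == basis_a else rng.randint(0, 1)
        else:
            basis_e, bit_e = basis_a, bit    # photon passes undisturbed
        basis_b = rng.randint(0, 1)          # Bob's basis
        bit_b = bit_e if basis_b == basis_e else rng.randint(0, 1)
        if basis_b == basis_a:               # keep matching-basis rounds
            sifted += 1
            errors += (bit_b != bit)
    return errors / sifted

print(bb84_qber(eavesdrop=False))            # 0.0, no errors
print(round(bb84_qber(eavesdrop=True), 2))   # ~0.25, Eve is exposed
```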

Thursday, June 16, 2011

Most discussions of double-slit experiments with particles refer to Feynman's quote in his lectures: "We choose to examine a phenomenon which is impossible, absolutely impossible, to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery." Feynman went on to add: "We should say right away that you should not try to set up this experiment. This experiment has never been done in just this way. The trouble is that the apparatus would have to be made on an impossibly small scale to show the effects we are interested in. We are doing a "thought experiment", which we have chosen because it is easy to think about. We know the results that would be obtained because there are many experiments that have been done, in which the scale and the proportions have been chosen to show the effects we shall describe".

Wednesday, June 15, 2011

In the 1970s, some unusual populist quantum physics books were published, such as The Tao of Physics and The Dancing Wu Li Masters. These authors were versed in modern physics, but they preached a lesson of Eastern mysticism. I thought that these books were a passing fad.

Tuesday, June 14, 2011

Most working scientists hold fast to the concept of 'realism'—a viewpoint according to which an external reality exists independent of observation. But quantum physics has shattered some of our cornerstone beliefs. According to Bell's theorem, any theory that is based on the joint assumption of realism and locality (meaning that local events cannot be affected by actions in space-like separated regions) is at variance with certain quantum predictions. Experiments with entangled pairs of particles have amply confirmed these quantum predictions, thus rendering local realistic theories untenable. Maintaining realism as a fundamental concept would therefore necessitate the introduction of 'spooky' actions that defy locality. ... Our result suggests that giving up the concept of locality is not sufficient to be consistent with quantum experiments, unless certain intuitive features of realism are abandoned.

If these statements were true, then there would be Nobel Prizes given to those who established them. Textbooks would describe this as one of the great and profound discoveries of all time. And if these ideas really refuted three millennia of scientific thought, what replaced reality and causality?

Non-local hidden-variable theory is much stranger than quantum mechanics. An experiment rejecting such a strange theory tells us nothing, and certainly does not justify the radical conclusions above. If there is something wrong with reality and causality, it is going to take some very convincing experiments.
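For reference, the Bell/CHSH arithmetic behind these claims is easy to check. A sketch comparing the quantum singlet correlation against one simple local hidden-variable model; both the model and the measurement angles are standard illustrative choices, not taken from the quoted paper:

```python
import math, random

def E_quantum(a, b):
    # Quantum prediction for the singlet-state correlation
    return -math.cos(a - b)

def E_local(a, b, trials=100000, seed=1):
    # A simple local hidden-variable model: a shared random angle lam
    # determines both outcomes deterministically and locally.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        lam = rng.uniform(0, 2 * math.pi)
        A = 1 if math.cos(a - lam) >= 0 else -1
        B = -1 if math.cos(b - lam) >= 0 else 1
        total += A * B
    return total / trials

def chsh(E):
    # CHSH combination; local models obey |S| <= 2 (Bell's bound)
    a, ap, b, bp = 0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
    return abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

print(chsh(E_quantum))  # 2*sqrt(2) ~ 2.83: violates the bound of 2
print(chsh(E_local))    # ~2: at the classical bound, never beyond
```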

Monday, June 13, 2011

“Anybody who’s not bothered by Bell’s theorem has to have rocks in his head.”
To this moderate point of view I would only add the observation that contemporary physicists come in two varieties.
Type 1 physicists are bothered by EPR and Bell’s theorem.
Type 2 (the majority) are not, but one has to distinguish two subvarieties.
Type 2a physicists explain why they are not bothered. Their explanations tend either to miss the point entirely (like Born’s to Einstein) or to contain physical assertions that can be shown to be false.
Type 2b are not bothered and refuse to explain why. Their position is unassailable. (There is a variant of type 2b who say that Bohr straightened out the whole business, but refuse to explain how.)

The physicists would only be bothered or confused if they listened to explanations like Mermin's. Bell only showed the impossibility of a local hidden variable theory. No one has been able to make such a theory work anyway. I would be bothered if there were any evidence for a local hidden variable theory.

Sunday, June 12, 2011

Peter Woit has a review of Jim Baggott’s The Quantum Story: A History in 40 Moments. The book says:

This means that a quantum computer could in principle be used to crack the encryption systems used for most Internet transactions, which are based on factoring large prime numbers. Not to worry, though, as Internet security could perhaps be restored using cryptography systems based on quantum entanglement! [p.348, fn.12]

No. Prime numbers cannot be factored. Quantum computers will not be cracking anything. Quantum entanglement is useless for internet security.
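To be clear about what RSA actually relies on: factoring the product n = p·q of two primes, not "factoring large prime numbers". A toy sketch with deliberately tiny primes (the standard textbook numbers, useless for real security):

```python
# RSA in miniature.  Security rests on the difficulty of factoring the
# public modulus n = p*q; with tiny primes, brute force breaks it.
p, q = 61, 53
n = p * q                    # 3233, the public modulus
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent via modular inverse (Python 3.8+)

message = 65
cipher = pow(message, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == message  # decrypt recovers the message

# Recovering the private key from (n, e) alone requires factoring n:
factor = next(k for k in range(2, n) if n % k == 0)
assert {factor, n // factor} == {p, q}
```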

It looks like a good book, as long as you don't take its conclusions seriously.

Friday, June 10, 2011

The Belgian priest Georges Lemaître discovered the Big Bang theory in 1927. He had both theory and data for the expansion of the universe. It is strange that Hubble was credited for it, when his 1929 paper was not as good.

The 1927 discovery of the expansion of the Universe by Lemaitre was published in French in a low-impact journal. In the 1931 high-impact English translation of this article a critical equation was changed by omitting reference to what is now known as the Hubble constant. That the section of the text of this paper dealing with the expansion of the Universe was also deleted from that English translation suggests a deliberate omission by the unknown translator. ...

(Lemaître 1927) to have been the first to find both observational and theoretical evidence for the expansion of the Universe. His observational discovery was based on the published distances and radial velocities of 42 galaxies. Lemaître’s theoretical result was based on the finding that the Universe is unstable, so that perturbations tend to grow. These results, which were published in French and in a relatively obscure journal, anticipated the work of Edwin Hubble (1929) by two years. It might therefore have been appropriate to assign the credit for the discovery of the expansion of the Universe to Lemaître, rather than to Hubble (Peebles 1984).

In summary it appears that the translator of Lemaître’s 1927 article deliberately deleted those parts of the paper that dealt with the determination of what is presently referred to as the Hubble parameter. The reason for this remains a mystery.

Okay, the translator was dishonest, but that does not explain the rest of the physics community. Lemaitre was well-known, and his theory was denounced by Einstein and others before Hubble. Einstein said, "Your math is correct, but your physics is abominable." Someone should have had the decency to admit that Lemaitre was right, and well ahead of Hubble.

Hubble supposedly had better data, as he used the world's biggest telescope. But his data was off by a factor of 10 or more, and not nearly as convincing as he pretended it was.
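For comparison, extracting a Hubble-type constant from velocity-distance data is just a one-parameter line fit; a sketch with synthetic stand-in numbers (not Lemaitre's or Hubble's actual measurements):

```python
import random

# Fit the slope of velocity vs. distance, v = H0 * d, from noisy data.
# The numbers are synthetic illustrations, not historical data.
random.seed(0)
H0_true = 70.0  # km/s per Mpc, roughly the modern value
data = [(d, H0_true * d + random.gauss(0, 50))
        for d in [random.uniform(1, 20) for _ in range(42)]]  # 42 "galaxies"

# Least-squares slope through the origin: H0 = sum(d*v) / sum(d*d)
H0_fit = sum(d * v for d, v in data) / sum(d * d for d, v in data)
print(round(H0_fit, 1))  # close to 70 despite the noise
```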

The year 1931 can undoubtedly be called Georges Lemaitre's annus mirabilis. Indeed, major contributions to relativistic cosmology by the Belgian physicist and priest appeared within a few months: ...

Lemaitre initially intended to conclude his letter to Nature by “I think that every one who believes in a supreme being supporting every being and every acting, believes also that God is essentially hidden and may be glad to see how present physics provides a veil hiding the creation".

Luminet says that Lemaitre removed the mention of God, but we do not know exactly why. Maybe he knew that the editors would not accept it.

Update: The new paper "Did Edwin Hubble plagiarize?" responds to Block, and disputes the idea that Hubble got his ideas from Lemaitre. That may be, but it still seems clear that Lemaitre did the work first, and that there was some sort of conspiracy not to credit him.

Thursday, June 9, 2011

It is ironic that Einstein's most creative work, the general theory of relativity, should boil down to conceptualizing space as a medium when his original premise [in special relativity] was that no such medium existed ...

The word 'ether' has extremely negative connotations in theoretical physics because of its past association with opposition to relativity. This is unfortunate because, stripped of these connotations, it rather nicely captures the way most physicists actually think about the vacuum. ... Relativity actually says nothing about the existence or nonexistence of matter pervading the universe, only that any such matter must have relativistic symmetry.

It turns out that such matter exists. About the time relativity was becoming accepted, studies of radioactivity began showing that the empty vacuum of space had spectroscopic structure similar to that of ordinary quantum solids and fluids. Subsequent studies with large particle accelerators have now led us to understand that space is more like a piece of window glass than ideal Newtonian emptiness. It is filled with 'stuff' that is normally transparent but can be made visible by hitting it sufficiently hard to knock out a part. The modern concept of the vacuum of space, confirmed every day by experiment, is a relativistic ether. But we do not call it this because it is taboo.

That is correct. The concept of the aether is universally accepted among physicists today, but they prefer to call it something else because of historical prejudices. Aether theory is one of the most accepted theories today.

The confusion about aether and relativity is almost entirely due to Einstein. His first relativity paper avoided the aether, and failed to address what the discoverers of relativity said about it. Then he tried to claim that abolishing the aether was his original contribution to relativity. Years later, he denied that, and said that a gravitational aether was essential. He never expressed an opinion about the aether that is necessary for quantum mechanics, but he did not even believe in quantum mechanics. Those who idolize Einstein frequently cite his opposition to the aether as his great genius idea.

I think that physicists have decided that using the word aether is disrespectful to Einstein, and that is why they don't like it.

Tuesday, June 7, 2011

Apparently some physicists reject one of the fundamental premises of quantum mechanics, such as this:

At least one physicist considers the “wave-particle duality” a misnomer, as L. Ballentine, Quantum Mechanics, A Modern Development, p. 4, explains:

When first discovered, particle diffraction was a source of great puzzlement. Are "particles" really "waves?" In the early experiments, the diffraction patterns were detected holistically by means of a photographic plate, which could not detect individual particles. As a result, the notion grew that particle and wave properties were mutually incompatible, or complementary, in the sense that different measurement apparatuses would be required to observe them. That idea, however, was only an unfortunate generalization from a technological limitation. Today it is possible to detect the arrival of individual electrons, and to see the diffraction pattern emerge as a statistical pattern made up of many small spots (Tonomura et al., 1989). Evidently, quantum particles are indeed particles, but whose behaviour is very different from what classical physics would have us expect.

Physicists at Rice University have completed the first real-time measurement of individual electrons, creating an experimental method that for the first time allows scientists to probe the dynamic interactions between the smallest atomic particles.

The research, which appears in the May 22 issue of the journal Nature, is important for researchers developing quantum computers, a revolutionary type of computer that is orders of magnitude more powerful than any computer ever built.

To date, computers have used the binary bit — represented by either a one or zero — as their fundamental unit of information. In a quantum computer, the fundamental unit is a quantum bit, or qubit. Because qubits can have more than two states, calculations that would take a supercomputer years to finish will take a quantum computer mere seconds.

Due to the complexities of quantum dynamics, electrons can serve as qubits. They can exist in "up" and "down" states -- single points that are analogous to the ones and zeroes in classical computers -- or in "superposition" states, which are not single points but patterns of probability that exist in several places at once.

All these high-tech experiments haven't given us any better understanding of electrons than Bohr had, and haven't brought us any closer to qubits. Bohr's complementarity arguments are just as valid today. Saying that particles have been detected makes no more sense than saying that waves have been detected.

But the same heliocentric argument was made two millennia earlier by Pythagoras and Aristarchus of Samos. Einstein did not even really believe that the Earth's motion was undetectable, as he believed (incorrectly) that it caused the ocean tides.

The modern relativity principle comes from Maxwell, Lorentz, and Poincare. Einstein had nothing to do with it. The details are in my book.

Saturday, June 4, 2011

An international group of physicists has found a way of measuring both the position and the momentum of photons passing through the double-slit experiment, upending the idea that it is impossible to measure both properties in the lab at the same time. ... Steinberg stresses that his group's work does not challenge the uncertainty principle, pointing out that the results could, in principle, be predicted with standard quantum mechanics.

This is nonsense, of course. The experiment does not contradict the uncertainty principle. It is an attempt at weak measurement, but it is questionable whether the concept even makes sense.

The UK BBC story also suggests that the experiment has done the impossible:

Researchers have bent one of the most basic rules of quantum mechanics, a counterintuitive branch of physics that deals with atomic-scale interactions.

Its "complementarity" rule asserts that it is impossible to observe light behaving as both a wave and a particle, though it is strictly both.

In an experiment reported in Science, researchers have now done exactly that.

They say the feat "pulls back the veil" on quantum reality in a way that was thought to be prohibited by theory. ...

While they were able to easily observe the interference pattern indicative of the wave nature of light, they were able also to see from which slits the photons had come, a sure sign of their particle nature.

The trajectories of the photons within the experiment - forbidden in a sense by the laws of physics - have been laid bare.

No, this experiment has not shown the particle nature of photons any more than previous experiments. Here is the conclusion from the actual research paper:

Single-particle trajectories measured in this fashion reproduce those predicted by the Bohm–de Broglie interpretation of quantum mechanics (8), although the reconstruction is in no way dependent on a choice of interpretation.

Controversy surrounding the role of measurement in quantum mechanics is as old as the quantum theory itself, and nowhere have the paradoxes been thrown into such stark relief as in the context of the double-slit experiment. Our experimentally observed trajectories provide an intuitive picture of the way in which a single particle interferes with itself. It is of course impossible to rigorously discuss the trajectory of an individual particle, but in a well-defined operational sense we gain information about the average momentum of the particle at each position within the interferometer, leading to a set of “average trajectories.” The exact interpretation of these observed trajectories will require continued investigation, but these weak-measurement results can be grounded in experimental measurements that promise to elucidate a broad range of quantum phenomena (7, 11–13, 15–17). By using the power of weak measurements, we are able to provide a new perspective on the double-slit experiment, which Feynman famously considered to have in it “the heart of quantum mechanics” (27).

This seems to give a boost to the De Broglie–Bohm theory. That theory has had a cult following for decades, but nothing useful has ever come out of it, as far as I know. The main advantage is that it is supposed to be more intuitive, but I think that it is much stranger than conventional quantum mechanics.

The double-slit experiment is 200 years old, but it continues to cause confusion today. An animated video explaining it is here. This paper uses averages to gain an "intuitive picture" of how light gets from the slits to the detectors, but that's all. It is not clear that it is measuring anything real.
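The interference pattern at the heart of all of this is just elementary wave optics. Here is a minimal Python sketch of the two-slit intensity formula, intensity proportional to cos²(π d sin θ / λ); the slit separation, wavelength, and screen distance are illustrative values I chose, not the parameters of the actual experiment:

```python
import math

def double_slit_intensity(x, L, d, wavelength):
    """Relative intensity at position x on a screen a distance L away,
    for two coherent slits separated by d.

    The path difference d*sin(theta) sets the phase difference between
    the two slits; the intensity goes as cos^2(phase/2)."""
    theta = math.atan2(x, L)
    phase = 2 * math.pi * d * math.sin(theta) / wavelength
    return math.cos(phase / 2) ** 2

lam = 500e-9   # 500 nm light (illustrative)
d = 10e-6      # 10 micron slit separation (illustrative)
L = 1.0        # screen 1 m away

# Bright fringe at the center of the screen:
print(double_slit_intensity(0.0, L, d, lam))  # prints 1.0
```

Fringes repeat wherever the path difference changes by one wavelength; nothing in this classical formula says which slit the light went through, which is exactly what makes the weak-measurement "trajectories" so contentious.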

Friday, June 3, 2011

As of 2011, no sane person seriously doubts that string theory is the right framework that describes the Universe around us. However, there remain uncertainties about the inner structure of the relevant string vacuum. String/M-theory possesses many four-dimensional solutions that qualitatively resemble our world.

Those solutions are connected on the configuration space of the theory and may be related by various dualities. However, the known stringy descriptions that are semi-realistic and weakly coupled may still be divided into several major categories:

What he is saying here is that string theory postulates an 11-dimensional space with zillions of possibilities for the vacuum, and no one has found one that matches our 4-dimensional spacetime vacuum. Some of the models have some qualitative resemblance to the vacuum, but that's all. The vacuum is what used to be called the aether.

A reader named Daniel responds:

I imagine that is a tongue-in-cheek comment, written with a smirk on your face. Am I right?

No, there has been no tongue-in-cheek comment in my text, except for the lolcat. It's a totally serious technical text, unlike your comment.

Well, I have no doubt that at least 90% of people don't have any clue about physics - I actually think that the number is above 99%. People are dumb as a doorknob, indeed. ...
I've banned Daniel after his third obnoxious comment. Just to make sure, this thread is dedicated to low-energy models of string theory.

Motl posts some good stuff sometimes, but he is off the rails. String theory has no chance of being correct.

Thursday, June 2, 2011

It could turn out to be a milestone for quantum computing. Last week, D-Wave Systems of Burnaby in British Columbia, Canada, announced the first sale of a commercial quantum computer, to global security firm Lockheed Martin, based in Bethesda, Maryland.

D-Wave's co-founder, Geordie Rose, says that the sale demonstrates that quantum computing is finally living up to its decades-long promise. Aaronson, however, thinks that the computer-science community will need more convincing. "Just because a flagship company has bought the system, doesn't mean that it now works," he says.

That mistrust goes back to 2007, when D-Wave apparently demonstrated a 16-qubit computer that could solve a Sudoku puzzle. Many computer scientists and physicists suggested that the device was actually being driven by plain old classical physics. At the time, D-Wave did not respond with any publications ruling out this possibility.

I don't doubt that it exploits quantum physics. It is hard to make a computer that does not. The magazine has another article, the current June 2011 cover story, which says: "Quantum mechanics is not just about teeny particles. It applies to things of all sizes: birds, plants, maybe even people". But I do doubt that any true qubits are involved.

Vlatko Vedral, the author of that SciAm article, has a separate claim that you can cool a quantum computer by erasing an entangled bit. No, I do not believe that anyone will ever succeed in such an experiment.

Wednesday, June 1, 2011

Dr. SkySkull is an optical physics professor, and he writes an informative post about the optical history of relativity:

The best stories in the history of physics are those in which someone comes from humble origins and, seemingly out of nowhere, makes a brilliant discovery that changes everything. Such stories, however, can give a very misleading impression of the nature of scientific progress: science is a continuous process, and a closer inspection of any incredible breakthrough always reveals that there were numerous earlier discoveries that anticipated it.

A great case study of this is Einstein’s special theory of relativity, introduced in 1905. Einstein’s groundbreaking work transformed mankind’s perceptions of space and time, provided answers to puzzling problems and led directly to other major discoveries, including the harnessing of nuclear energy. However, Einstein’s revelations were preceded by some twenty years of gradual progress, during which time researchers put forth tantalizing hypotheses that came closer and closer to the truth.

One such discovery was made in 1889 by George FitzGerald. To explain a seemingly incomprehensible experimental result, he suggested that objects in motion shrink along their direction of travel. In this post, we will discuss what is now known as the FitzGerald-Lorentz length contraction and explain how FitzGerald fell short of the astonishing ideas that would be conceived by Einstein.

He gives a nice explanation of how FitzGerald deduced the contraction from Michelson-Morley, and then how Einstein used essentially the same argument to get the same formula 16 years later. But he concludes:

Herein lies the difference between the FitzGerald form of length contraction and the Einstein form. In the FitzGerald version, both the first and second observer can agree that the experiment is moving. The observer moving with the experiment would in principle be able to measure a change in the electrical forces between molecules, demonstrating a motion in the aether. In the Einstein version, the two observers agree that the speed of light is c, but disagree on the distances that the light has to travel.

No, that is not a difference. Dr. SkySkull quotes the FitzGerald 1889 paper. The paper does not say that any such change in electrical forces could be measured. The Michelson-Morley experiment failed to detect any motion thru the aether, and FitzGerald did not say that any such motion was detectable.
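FitzGerald's deduction can be checked with a little arithmetic. In an interferometer moving at speed v through a hypothetical aether, light takes longer on the round trip along the arm parallel to the motion than along the perpendicular arm, which should produce a fringe shift; contracting the parallel arm by exactly sqrt(1 - v²/c²) makes the two round-trip times equal, so no shift is seen. A minimal numerical sketch (the 11 m arm and 30 km/s speed are rough illustrative figures for the Michelson-Morley setup and the Earth's orbital motion):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def round_trip_parallel(length, v):
    """Round-trip light time along an arm parallel to the motion
    through the aether: out against the 'wind', back with it."""
    return length / (C - v) + length / (C + v)  # = 2*length*C/(C**2 - v**2)

def round_trip_perpendicular(length, v):
    """Round-trip light time along an arm perpendicular to the motion."""
    return 2 * length / math.sqrt(C**2 - v**2)

arm = 11.0       # meters (illustrative)
v = 30_000.0     # ~30 km/s (illustrative)

t_par = round_trip_parallel(arm, v)
t_perp = round_trip_perpendicular(arm, v)
# Without contraction, the parallel trip takes longer, predicting a fringe shift:
print(t_par > t_perp)  # prints True

# FitzGerald: shrink the parallel arm by sqrt(1 - v^2/c^2) ...
contracted = arm * math.sqrt(1 - (v / C) ** 2)
# ... and the two round-trip times agree exactly, so no shift is observed:
print(math.isclose(round_trip_parallel(contracted, v), t_perp))  # prints True
```

The same algebra, read the other way around, is the length contraction of special relativity; the formula is identical, which is the point of the comparison between FitzGerald and Einstein.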

Dr. SkySkull goes on:

Einstein’s theory represented a huge change in mankind’s perception of nature and the universe. Space and time were relegated from absolute quantities to relative ones depending on an observer’s state of relative motion. Nevertheless, others were already anticipating Einstein’s discovery before him. Lorentz, who was a proponent of the FitzGerald contraction hypothesis, also derived a mathematical transformation between observers — the Lorentz transformation — that left the speed of light constant for all observers. This transformation is now a part of Einstein’s relativity. Other scientists such as Ernst Mach and Henri Poincaré had already in the late 1800s begun to dismiss the notion of absolute space and time, and the aether along with it, and propose a new theory of relativity. In fact, one distinguished scientist stubbornly refused to accept any real significance in Einstein’s relativity work.

Einstein, however, was the one to bring all the scattered musings of the different scientists together into a new principle of physics. Where the others were inching towards a new theory of relativity, Einstein leapt towards it wholeheartedly. It was a great discovery, but the earlier efforts of others such as FitzGerald show that Einstein’s work was part of a greater effort to understand the nature of space and time.

No, Einstein had no new principle. His principles are present in that 1889 FitzGerald paper, as you can see in SkySkull's explanation. FitzGerald, Lorentz, and Poincaré were much bolder than Einstein. By the time that Einstein published his explanation in 1905, it was the more conventional of the competing explanations.

In a 2008 post, SkySkull calls Edmund Whittaker an anti-relativist. Obviously he hasn't read Whittaker's book, but it is very enthusiastic about special and general relativity, and is very generous in over-crediting Einstein for general relativity.

SkySkull quotes a Max Born letter to Einstein:

Whittaker, the old mathematician, who lives here as Professor Emeritus and is a good friend of mine, has written a new edition of his old book History of the Theory of the Ether, of which the second volume has already been published. Among other things it contains a history of the theory of relativity which is peculiar in that Lorentz and Poincaré are credited with its discovery while your papers are treated as less important. Although the book originated in Edinburgh, I am not really afraid you will think that I could be behind it. As a matter of fact I have done everything I could during the last three years to dissuade Whittaker from carrying out his plan, which he had already cherished for a long time and loved to talk about. I re-read the originals of some of the old papers, particularly some rather off-beat ones by Poincaré, and have given Whittaker translations of German papers (for example, I translated many pages of Pauli’s Encyclopaedia article into English with the help of my lecturer, Dr. Schlapp, in order to make it easier for Whittaker to form an opinion). But all in vain. He insisted that everything of importance had already been said by Poincaré, and that Lorentz quite plainly had the physical interpretation. As it happens, I know quite well how sceptical Lorentz was and how long it took him to become a relativist. I have told Whittaker all this, but without success. I am annoyed about this, for he is considered a great authority in the English speaking countries and many people are going to believe him.

Born probably helped convince Whittaker that Einstein's special relativity papers were less important. Everyone who reads and understands those old papers comes to the conclusion that Lorentz and Poincaré had the whole theory before Einstein.