Explaining EPR after Bell's inequalities

We (three students from the Netherlands) are working on a project on Bell's inequalities. We have studied the original EPR paper from 1935, which argues that quantum mechanics may well be an incomplete theory, as well as the reactions to that paper: von Neumann's completeness theorem and the Kochen-Specker theorem.
And lately the derivation of Bell's inequalities (in both the deterministic and the indeterministic case).

Now we are wondering which explanations of the observed correlations are still possible. Clearly no local hidden variables. But just saying "the universe is non-local" does not seem very satisfying.

Hopefully there are some people here who would like to discuss this interesting subject with us.

There are many responses to the violation of Bell inequalities. This issue is quite subtle and has been picked apart in fine detail by physicists and philosophers. There are two main groups of responses. The first tries to maintain local realism by pointing out loopholes in the proof of Bell's theorem or in the actual experimental tests that have been performed. Most physicists (but not all) feel that these arguments are ad hoc and that they would require a strange "conspiracy" in nature to explain the current results. Here are some of these responses:

1. There has been no loophole-free experimental test of Bell's inequalities. If one is ever performed, it will fail to show quantum correlations. Experimentalists believe they will be able to perform a loophole-free test in the next few years. The loopholes include:

- detection efficiency: No experimental test so far detects 100% of the particle pairs emitted. It is usually assumed that the particles detected represent a "fair sample". The problem with this was first pointed out by Pearle (Phys. Rev. D 2, 1418-1425 (1970)).

- causality: The choice of measurement settings should be "truly random", and each choice should be made at spacelike separation from the other measurement. This does not apply to all experimental tests, and unfortunately it does not apply to most of the ones with high detection efficiency.

2. There is something wrong with the assumptions in Bell's theorem. There are numerous get-outs that can be invoked. For example, it is usually assumed that the experimental settings can be controlled. This requires an assumption of free-will of the experimenter, or at least that there are physical processes independent of the Bell experiment that can be used as an effective source of randomness (http://xxx.arxiv.org/abs/quant-ph/0204169). One can simply deny that this is the case. Another interesting one is that the usual proof assumes that the sample spaces involved are measurable sets. Toy models can be constructed that reproduce the Bell correlations using non-measurable sets, but these are rather ad hoc (http://edelstein.huji.ac.il/staff/pitowsky/Itamar%20Pitowsky_files/Paper%2001.pdf).

The second group of responses accepts that Bell's theorem is true. Therefore, either locality or realism has to be false.

3. Realism is true, but the universe is non-local. It is possible to construct hidden variable theories that are explicitly non-local and contextual, but are nonetheless fairly natural looking. Bohmian mechanics is the most famous example of this (http://plato.stanford.edu/entries/qm-bohm/).

4. Retreat into operationalism. Quantum mechanics is just a calculus for computing probabilities of measurement outcomes. It says nothing about an underlying reality, which may not even exist, and therefore nothing about its locality. This is similar to the standard Copenhagen response, but to me it seems to raise more questions than it answers. For example, what exactly constitutes a measurement?

5. Consult one of the multitude of interpretations of quantum mechanics and see what it has to say on the matter. Most of them require you to accept at least one dubious idea, but at least you can choose exactly where you want to put the dodgy bit. Visit http://plato.stanford.edu and take your pick.

1. There has been no loophole-free experimental test of Bell's inequalities
Aspect (in Nature '99, "Bell's inequality test: more ideal than ever") on an experiment by Zeilinger et al. (1998, Phys. Rev. Lett. 81, 5039-5043):
"Note that there remains another loophole, due to the limited efficiency of the detectors, but this can be closed by a technical advance that seems plausible in the foreseeable future, and so does not correspond to a radical change in the scheme of the experiment. Although such an experiment is highly desirable, we can assume for the sake of argument that the present result will remain unchanged with high-efficiency detectors."

So there is still a small chance that Bell's inequality is not violated, but I do not believe in it.

2. "This requires an assumption of free-will of the experimenter" In my opinion we should not be bothering with such strange things as an experimenter without free will.

I believe point 3 and 4 are the most interesting ones.
I always learned that the Copenhagen interpretation is the correct one. But in fact, as you say, I discovered recently that the problems with it (the measurement problem; the collapse of the wave function) have not been solved in a satisfactory way. How strong are the experiments in favor of the Zeno paradox? (That a watched pot never boils; or better said: if you constantly observe a system, nothing happens to it, because the wave function collapses all the time, never letting the system evolve by the Schrodinger equation.)

My conclusion (so far) from all this is that we live in a non-local universe (point 3).

P.S. In fact, what the experiments actually test are the Clauser-Horne-Shimony-Holt (CHSH) inequalities, which reframe Bell's inequalities in a way better suited for experiments.
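For what it's worth, the quantum prediction for the CHSH combination is easy to compute. Here is a small Python sketch, assuming the textbook correlation E(a, b) = -cos 2(a - b) for polarization-entangled photon pairs; the angle settings are just the standard optimal choices:

```python
import numpy as np

def E(a, b):
    """Quantum correlation for a singlet-like polarization state:
    E(a, b) = -cos(2(a - b))."""
    return -np.cos(2 * (a - b))

def chsh(a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    Local hidden variable theories require |S| <= 2."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Standard optimal polarizer settings (in radians):
a, a2 = 0.0, np.pi / 4
b, b2 = np.pi / 8, 3 * np.pi / 8

S = chsh(a, a2, b, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828, beyond the local bound of 2
```

This is the famous Tsirelson value 2√2, the maximum quantum mechanics allows for this combination.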

P.P.S. See http://gene.science.uva.nl/~skowalcz/Bell.pdf

This is more or less a historical overview intended as an introduction.

We are now working on GHZ and CHSH (theory), Aspect-Dalibard-Roger and Zeilinger et al. (experiments), and also on some material on C*-algebras, but that's not easy...

The information you have both supplied has been very helpful and continues to provide sources for consideration. But I want to extend this thinking correctly, based on the issues of quantum mechanics.

Experiments to test Bell's inequality involve measuring the properties of pairs of particles that are space-like separated in the sense of special relativity: in other words, there is no time for a light signal to travel between them within the duration of the experiment. In a typical Bell's inequality experiment the polarizations of a pair of photons are measured as the relative angle between the axes of polarizers making the measurements is varied.

Quantum mechanics predicts that "non-local" correlations can exist between the particles. This means that if one photon is polarized in, say, the vertical direction, the other will always be polarized in the horizontal direction, no matter how far away it is. However, some physicists argue that this cannot be true and that quantum particles must have local values - known as "hidden variables" - that we cannot measure.
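To make the anticorrelation concrete, here is a small sketch (again assuming the singlet-like correlation E(a, b) = -cos 2(a - b)): since E = P_same - P_diff and P_same + P_diff = 1, the probability that the two polarization measurements agree is sin²(a - b), so parallel polarizers always give opposite results:

```python
import numpy as np

def p_same(a, b):
    """Probability that the two polarization measurements agree,
    for the singlet-like state with E(a, b) = -cos(2(a - b)):
    P_same = (1 + E) / 2 = sin^2(a - b)."""
    return np.sin(a - b) ** 2

# Parallel polarizers: the outcomes always disagree, so if one photon
# is found vertically polarized, the other is found horizontal.
print(p_same(0.0, 0.0))        # 0.0: perfect anticorrelation
print(p_same(0.0, np.pi / 2))  # 1.0: polarizers at 90°, always agree
```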

It is a very simple deduction that, if magnetic fields could have been given Gaussian coordinates, then a realization of spin characteristics (space) would have been a telling feature in describing the actions in the gamma-ray tests used in GLAST?

How does this affect entanglement as an attempt at a geometrical formulation of quantum gravity? If one recognizes the dimensional significance the graviton represents, then any action of that interaction will immediately be foretelling of the space?

2. "This requires an assumption of free-will of the experimenter" In my opinion we should not be bothering with such strange things as an experimenter without free will.

I agree, and so do most experts on the Bell inequalities. Free will is a very philosophically challenging concept, and philosophers have not even agreed about whether it is compatible with determinism. This is why most people phrase this loophole as something like "there are physical processes independent of the Bell experiment that can be used as an effective source of randomness". It is called the independence assumption.

Most attempts to explain the Bell correlations by violating the independence assumption involve attributing common causes to physical processes that seem to be independent. These theories seem to involve a conspiracy in nature, and so most people believe they are wildly implausible. I recommend getting hold of a copy of Bell's "Speakable and Unspeakable in Quantum Mechanics", where he discusses this issue. However, there is one way of avoiding conspiracies: the independence assumption would also be violated in a theory with backward causation, or advanced action. This can be argued for on time-symmetry grounds and avoids explicit nonlocality. The quantum mechanics chapters of Price's book "Time's Arrow and Archimedes' Point" are very good on this issue.

I always learned that the Copenhagen interpretation is the correct one.

But in fact, as you say, I discovered recently that the problems with it (the measurement problem; the collapse of the wave function) have not been solved in a satisfactory way.

Well, strictly speaking there is no measurement problem in Copenhagen. The interpretation is constructed to deftly sidestep the whole issue. It was von Neumann who formulated the measurement problem and the collapse postulate.

How strong are the experiments in favor of the Zeno paradox? (That a watched pot never boils; or better said: if you constantly observe a system, nothing happens to it, because the wave function collapses all the time, never letting the system evolve by the Schrodinger equation.)

Well, the issue is quite confusing. There is a disagreement about what exactly constitutes a measurement in Zeno-type experiments. There is also the "anti-Zeno" effect in some of these experiments that seems to go in the other direction.

Incidentally, there is a very simple proof of the Kochen-Specker theorem, due to Mermin, that you could easily include in your discussion (Phys. Rev. Lett. 65, 3373 (1990)).

I remember reading about an experiment that excited some atoms and then, by keeping looking at them with a fast laser, pretty much prohibited the atoms from going back to their lower energy state again.

Another, more recent experiment with an attosecond laser (I believe) allowed a rather non-quantum-mechanical observation: an electron flying around the atom.

I learned QM from Griffiths's book. Am I wrong that what he teaches is the Copenhagen interpretation? (OK, in fact the book is not so much about interpretations as about doing integrals and calculating stuff.)

It was von Neumann who formulated the measurement problem and the collapse postulate.

Right. So one of his axioms was indeed the collapse axiom:

(5) If a measurement of the observable [tex]\mathfrak{A} [/tex],
represented by [tex]A[/tex] yields a result between [tex]\lambda_1[/tex], and
[tex]\lambda_2[/tex], then the state of the system immediately after the
measurement is an eigenfunction of [tex]E_{\lambda_2} -
E_{\lambda_1}[/tex].

But another one was:

(4) The time development of the state vector [tex]\phi[/tex] is
determined by the equation [tex]H \phi = i \hbar \partial \phi /
\partial t [/tex], known as the Schrodinger equation, where the Hamiltonian [tex]H[/tex] is the evolution operator and [tex]\hbar[/tex] is Planck's constant divided by [tex] 2 \pi[/tex].

What exactly is a measurement? And when is
the (5) projection postulate supposed to take over from the
(4) Schrodinger dynamics?

I remember reading about an experiment that excited some atoms and then, by keeping looking at them with a fast laser, pretty much prohibited the atoms from going back to their lower energy state again.

See Griffiths, 'Introduction to Quantum Mechanics', Afterword A4. He calculates the probability that a system is still in the excited state after a certain time in which N observations are made. Taking N -> infinity implies that a continuously observed system never decays.
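That limit is easy to check numerically. A minimal sketch, assuming a simple two-level (Rabi) model in which the probability of surviving a single observation interval of length T/N is cos²(ωT/2N); the total survival probability after N projective observations is then that quantity to the Nth power:

```python
import numpy as np

def survival(N, T=1.0, omega=np.pi):
    """Probability that a two-level system oscillating at frequency
    omega is still found in its initial state after N equally spaced
    projective observations during total time T:
    [cos^2(omega*T/(2N))]^N = cos(omega*T/(2N))^(2N)."""
    return np.cos(omega * T / (2 * N)) ** (2 * N)

# With these parameters an unwatched system has fully decayed at t = T,
# but frequent observation freezes it: the probability approaches 1.
for N in (1, 10, 100, 1000):
    print(N, survival(N))
```

For small intervals the per-interval survival is 1 - O(1/N²), so the product behaves like exp(-const/N) and tends to 1: the quantum Zeno effect.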

If this is really true (no decisive experiments have been performed yet), QM gives an absurd prediction. So some authors claim the wave function does not collapse, and that von Neumann's fifth postulate is untenable.