Saturday, October 5, 2013

A new Quantum Mechanics interpretation

The composability interpretation

We have reached the end of this series on how we should
understand the quantum mechanics wavefunction: is it ontological or epistemological
in the standard sense? The short answer is that it is neither. The long answer
is that a new interpretation is emerging from the project of recovering quantum
mechanics from natural axioms: http://arxiv.org/abs/1303.3935.

Wikipedia presents a comparison table of the existing interpretations.

Before showing how the new interpretation measures up in
that table, I need to state that the answers are not unique: they depend on
the context. For example: is quantum mechanics deterministic? In quantum
mechanics the evolution is purely unitary, including during the apparent collapse http://arxiv.org/abs/1305.3594, so in
the elliptic
composability context the answer is yes. As a reminder,
composability means that the laws of nature are invariant under tensor
composition, and elliptic composability is one of the three possible
composability classes: elliptic (quantum mechanics), parabolic (classical
mechanics), and hyperbolic (not present in nature). However, in the usual sense
of predicting concrete experimental outcomes, quantum mechanics is
not deterministic.
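The elliptic answer can be made concrete in a toy sketch (my illustration, not from the paper): a unitary matrix acting on a state vector evolves it deterministically and preserves the norm, which is the sense in which the evolution is deterministic even though individual outcomes drawn from |psi|^2 are not.

```python
import math

# Toy illustration: unitary evolution is deterministic and norm-preserving.
theta = 0.7
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]  # a 2x2 unitary (real rotation)

psi = [0.6, 0.8]  # normalized state: 0.36 + 0.64 = 1
psi_t = [sum(U[i][j] * psi[j] for j in range(2)) for i in range(2)]

norm_before = sum(abs(a) ** 2 for a in psi)
norm_after = sum(abs(a) ** 2 for a in psi_t)
print(norm_before, norm_after)  # both 1.0 up to rounding
```

The randomness enters only when one asks which concrete outcome occurs, not in how the state itself evolves.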

| | Elliptic composability | Parabolic composability |
| --- | --- | --- |
| Deterministic? | Yes: the evolution is purely unitary | No: the individual experimental outcomes are unpredictable |
| Wavefunction real? | Yes, the wavefunction is elliptic-ontic: it has a reality similar to, for example, the electromagnetic field, with the core difference that the wavefunction exists in a configuration space | No, in the sense of Heisenberg: the wavefunction represents a probability, but not an objective reality itself in space and time |
| Unique history? | Yes | Yes |
| Hidden variables? | Does not apply | No |
| Collapsing wavefunctions? | No | No |
| Observer role? | No | No |
| Local? | Locality independent | No |
| Counterfactual definiteness? | Yes, in a new sense: we can speak about observable algebras before measurement; we still cannot speak about results of measurements that have not been performed | No |
| Universal wavefunction exists? | Unclear/undecided | Unclear/undecided |
| Contextual? | No, in the sense of Gleason's theorem | Yes, in the sense of the Kochen-Specker theorem |

The mathematical framework of quantum mechanics is that of
C* algebras/modules. Observables form an algebra, and one can define a unique
norm of an element T using the spectral radius: the maximum modulus
of a complex number lambda such that
(T - lambda I) does not have an
inverse. If the norm ||T|| satisfies the C* condition:

||T^{dagger} T|| = ||T||^2

then one can show, using the GNS construction, that T is an operator on a Hilbert space
H.

It is an elementary exercise to show the reverse
implication, but constructing a Hilbert space from the C* condition is
nontrivial and was first done by Gelfand in 1942.
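The C* identity can be checked numerically. The sketch below (plain Python, my own illustration) computes the operator norm of a 2x2 matrix as the square root of the largest eigenvalue of T†T, using the standard closed-form eigenvalues of a 2x2 Hermitian matrix, and verifies ||T†T|| = ||T||².

```python
import math

def dagger(A):
    # Conjugate transpose of a 2x2 complex matrix
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def op_norm(A):
    # Operator norm = sqrt of the largest eigenvalue of A†A (2x2 case)
    M = matmul(dagger(A), A)  # Hermitian, positive semi-definite
    a, b, d = M[0][0].real, M[0][1], M[1][1].real
    # Eigenvalues of [[a, b], [conj(b), d]]: ((a+d) ± sqrt((a-d)^2 + 4|b|^2))/2
    disc = math.sqrt((a - d) ** 2 + 4 * abs(b) ** 2)
    return math.sqrt((a + d + disc) / 2)

T = [[1 + 2j, 3], [0, 1 - 1j]]        # an arbitrary (non-normal) matrix
lhs = op_norm(matmul(dagger(T), T))   # ||T† T||
rhs = op_norm(T) ** 2                 # ||T||^2
print(lhs, rhs)  # equal up to rounding
```

Of course, the content of the Gelfand result is the converse direction: that this innocent-looking algebraic identity is enough to build the Hilbert space in the first place.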

Now what struck me the most from that paper is that it is
signed: “Moscow 1942”. In 1942, Nazi troops were about
20 miles away from Moscow ready to burn down the city (before the winter weather
stopped their advance), and instead of packing up and leaving for safety,
Gelfand was making fundamental contributions to functional algebra. Remarkable.

There is a big subtlety relating to Hilbert spaces in
quantum mechanics. There are actually two Hilbert spaces, and they are linked
through the Choi-Jamiolkowski isomorphism, or channel-state duality http://en.wikipedia.org/wiki/Channel-state_duality.

Naively this comes from identifying |psi><phi| with
<phi|psi>, but they are very different beasts with different norms. An observable T belongs to a Hilbert space of its own,
with the inner product given by Trace(A^dagger
B), where A and B are observables; T
is also an operator on the usual Hilbert space H of the wavefunction. In the
information-theoretical approaches to deriving quantum mechanics one starts
from the states and the experimental procedures one can perform in the lab, in an
instrumentalist fashion (see for example Hardy's famous five reasonable axioms
paper http://arxiv.org/abs/quant-ph/0101012).
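The "Hilbert space of operators" can be made concrete in a small sketch (my illustration): Trace(A†B) is an inner product on 2x2 matrices, under which, for example, the Pauli matrices X and Z are orthogonal vectors.

```python
def dagger(A):
    # Conjugate transpose of a 2x2 matrix
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def hs_inner(A, B):
    # Hilbert-Schmidt inner product Tr(A† B): treats operators as vectors
    Ad = dagger(A)
    return sum(Ad[i][k] * B[k][i] for i in range(2) for k in range(2))

X = [[0, 1], [1, 0]]    # Pauli X
Z = [[1, 0], [0, -1]]   # Pauli Z
print(hs_inner(X, Z))   # 0: X and Z are orthogonal under Tr(A† B)
print(hs_inner(X, X))   # 2: squared Hilbert-Schmidt norm of X
```

The same X acts as an operator on the usual two-dimensional Hilbert space of the wavefunction, which is the double role the channel-state duality makes precise.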

The aim of the composability approach and interpretation is much
larger, however: no concepts should be assumed from the real world, and
everything should be fully derived mathematically, in a rigorous fashion.

One subtle point is that the actual representation of the
wavefunction's abstract Hilbert space does not have any ontological interpretation.
The same abstract Hilbert space can be represented by very different mathematical
constructs which cannot be put into any isomorphism. A case in point is the
Dirac delta function, which is a distribution and not a proper function: as a function
it has contradictory properties, and properly understanding it requires knowledge of spectral
theory.
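A toy numerical sketch (my illustration) shows the distributional character of the delta: no function takes the value "infinity at a single point", but pairing narrowing normalized Gaussians with a smooth test function recovers the value of that function at the origin.

```python
import math

def gaussian(x, eps):
    # Normalized Gaussian of width eps; tends to delta(x) as eps -> 0
    return math.exp(-x * x / (2 * eps * eps)) / (eps * math.sqrt(2 * math.pi))

def integrate(f, a, b, n=200000):
    # Simple midpoint rule
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# Integral of delta(x) f(x) dx should give f(0) = cos(0) = 1
values = [integrate(lambda x: gaussian(x, eps) * math.cos(x), -5.0, 5.0)
          for eps in (0.5, 0.1, 0.02)]
print(values)  # approaches 1.0 as eps shrinks
```

The delta "function" is the limit of this pairing, not of the Gaussians themselves as pointwise functions, which is precisely where the naive function picture breaks down.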

The wavefunction is simply a mathematical, non-contextual
(i.e., universal) tool for computing probabilities. What have ontological value
are the operators, which correspond to actual experimental procedures in a
lab. Through channel-state duality the wavefunction indirectly acquires (elliptic)
ontological value too. This is the sense in which the wavefunction can be called
elliptic-ontic.

For a one-particle wavefunction (when the configuration space
becomes space-time), the fact that the wavefunction is defined only up to a
phase can be understood as a gauge degree of freedom corresponding to U(1). The wavefunction in this case has the same ontological value as the electromagnetic field.

Now this may sound too good to be true, but it is natural in the context of C* modules (the natural generalization of C* algebras). In C* modules the number system of quantum mechanics is no longer the complex numbers but a C* algebra. Also, a typical Hilbert C* module corresponds to a vector bundle, and the gauge degree of freedom corresponds to a connection on the bundle. Expanding on this, in one case Born's interpretation can be upgraded from a probability density to a probability current density, naturally arriving at Dirac's equation and the electroweak gauge symmetry SU(2)xU(1); this justifies the minimal coupling. The mathematical framework for this generalization is that of a spectral triple in non-commutative geometry.

Classifying all possible realizations of elliptic composability is a work in progress. It looks like nature picked one particular realization, which gives rise to the Standard Model. Why this particular realization was picked cannot be answered within the composability framework; additional ideas are needed.
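The starting point of the U(1) story is elementary and can be checked directly (a toy sketch, my own): multiplying a state by a global phase exp(i*theta) leaves every probability |psi_k|^2 unchanged, so the phase is physically unobservable, i.e., a gauge degree of freedom.

```python
import cmath

# A global phase exp(i*theta) is unobservable: probabilities are unchanged.
psi = [0.6, 0.8j]   # normalized two-component state
theta = 1.234       # arbitrary phase
psi_rot = [cmath.exp(1j * theta) * a for a in psi]

probs = [abs(a) ** 2 for a in psi]
probs_rot = [abs(a) ** 2 for a in psi_rot]
print(probs, probs_rot)  # identical up to rounding
```

Promoting this rigid global phase to a position-dependent one, theta(x), is what forces the introduction of a connection (the electromagnetic potential) and leads to the minimal coupling discussed above.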

7 comments:


A bit of the history seems out of line. The Wehrmacht got within sight of the Kremlin spires in November/December 1941, and the Russians managed to stop them at that point. In 1942 the Germans began their offensive toward the Don River and the Caucasus region. The greatest asset the Russians, and the Allies in general, had in WWII was that Hitler was a complete moron; he had only a certain political smarts and a bizarre genius for manipulating large masses. Other than that he was a complete numbskull. Stalin was no military genius either, but Zhukov managed to get him to relinquish control of the war situation. By contrast, Hitler took more control of the situation as time went on and made a deeper mess of things for the German side.

The elliptic and parabolic conditions seem to mark something similar to extremal conditions in general relativity. There is also a question I have in mind concerning what relationship the parabolic condition has to the Heisenberg group. The Heisenberg group is nilpotent and a Borel group of upper-right triangular matrices. The parabolic condition appears to have some relationship to the central element of the Heisenberg algebra. Of course, in your work the parabolic case is the classical condition, which I think is similar to the extremal black hole condition, which has T = 0 and is eternally stable with no quantum emissions or Hawking radiation. This is a sort of ideal limit, suggesting that classical mechanics is an ideal limit that is not completely realistic physically.

I have it in mind, of course, that quantum entanglement builds up spacetime structure. In particular it builds up event horizons, and the loss of entanglement across them can be observed by an exterior observer. If you have a pure entangled system, the information between a and b in regions A and B is

I(a,b) = S(a) + S(b) - S(a∪b)

The information is always positive and vanishes only if the density matrix for a and b is a direct tensor product (non-entangled), so that S(a∪b) = S(a) + S(b). This means the mutual information between two regions decreases as their entanglement vanishes. The mutual information for operators in these two regions A and B is then related to the correlator \langle O_b O_a\rangle.

As the regions A and B become decorrelated, the expectation \langle O_b O_a\rangle approaches zero. This overlap will decrease due to the formation of an event horizon. The occurrence of an event horizon is related to the existence of a Bogoliubov metric coefficient, so that \langle O_b O_a\rangle ~ e^{-s}, where s is geometrically equivalent to a proper interval.

I have for a long time thought the wave function was a matter of ontological relativity. The wave function is not completely real in a classical sense; if nothing else it is complex. It is usually regarded as an epistemological entity that holds all possible information that can be obtained under all possible sets of measurements. The wave function is then regarded as something that informs or confers knowledge and not something which exists. However, if wave function collapse is something itself subjective then this perspective can’t be entirely correct. There is then some sort of relativity between ontology and epistemology with respect to wave functions.

The wavefunction is probably best described as a mathematical tool. The case for ontology is weak and indirect, and the case for the epistemic interpretation is blocked by the PBR theorem and by the impossibility of collapse.

In the next post I'll expand on Hardy's five reasonable axioms paper and its continuation in the work of Jochen Rau. Then I'll present Grgin's quantionic quantum mechanics and Zovko's interpretation of the wavefunction as a probability density current. I'll also touch on quaternionic quantum mechanics. All those examples will help build a better understanding of quantum mechanics.

It seems that scientists are able to maintain the coherence of a quantum bit without causing a sudden collapse of the wavefunction. I'm not a physicist and may not be able to interpret such results in the right manner. I wonder if such findings could rule out New Age quantum mysticism interpretations, wherein consciousness causes collapse and is not considered a separate entity from the physical world.

The paper from Nature is about weak measurements. This is an interesting topic, and I should probably explain it in future posts.

Regarding the relationship between quantum mechanics and consciousness, there is none, despite serious efforts to prove otherwise.

The late Sidney Coleman once told a story about himself and Yakir Aharonov on this topic. If consciousness causes the physical effect of wavefunction collapse, what happened before there were humans in the universe? So Aharonov asked Coleman: "Tell me: did your father collapse the wavefunction before you were born?"