Friday, May 16, 2014

Steven Weinberg's mutated density matrices

CLOUD: First, off-topic news. CLOUD at CERN published some new results. Previously, they showed that sulphuric acid alone doesn't seem to be enough to produce more clouds. Now, if the acid is combined with oxidized organic vapours from plants etc., low-lying clouds may be created and cool the planet. The cosmic rays may still modulate the process and matter a lot but only if the concentration of the two previous ingredients is low enough for the cosmic rays to be helpful.

The title is a bit misleading. What Weinberg mostly tries to do is to study possible symmetry transformations of density matrices that don't descend from transformations of pure states. This blog post has two comparable parts. The first one is dedicated to Weinberg's negative, knee-jerk reactions to the foundations of quantum mechanics. The second part is dedicated to his hypothetical non-pure-state-based density matrices.

Anti-quantum noise as an introduction

The beginning of the paper reflects Weinberg's personal dissatisfaction with quantum mechanics.

Two unsatisfactory features of quantum mechanics have bothered physicists for decades.

However, there are no (i.e. zero) unsatisfactory features of quantum mechanics, so what has bothered the physicists was the inadequacy and stubbornness of these physicists themselves, not any unsatisfactory features of quantum mechanics.

The first is the difficulty of dealing with measurement.

There is no difficulty of dealing with measurement. On the contrary, measurement – or observation or perception (these two words may ignite different emotions in the reader's mind but the physical content is equivalent) – is the process whose outcomes quantum mechanics is predicting, and it is doing so probabilistically.

Weinberg continues:

The unitary deterministic evolution of the state vector in quantum mechanics cannot convert a definite initial state vector to an ensemble of eigenvectors of the measured quantity with various probabilities.

It cannot and indeed, it doesn't and shouldn't. The unitary deterministic evolution is just a time-dependence of the state vector or the density matrix or the operators whose purpose is to calculate the probabilities of measurements. And one must know what the questions are before he demands that a quantum mechanical theory calculates the answers for him.

Later, the paper discusses this point in detail and I am sure that as soon as Weinberg gets sober, he will agree with me. By the ensemble, he really means a particular decomposition of a density matrix as\[

\rho = \sum_i p_i \ket{\psi_i}\bra{\psi_i}

\] into state vectors and probabilities. If we allow the vectors \(\ket{\psi_i}\) not to be orthogonal to one another, \(\rho\) may be expanded in infinitely many ways according to this template. But it's obvious that a particular choice to decompose the density matrix is unphysical. All predictions about the physical system are encoded in the density matrix itself. Weinberg himself admits this trivial fact – if the choice of the decomposition were real, its change could be used to send information superluminally.
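The non-uniqueness of the decomposition is easy to check numerically. A minimal NumPy sketch (the single-qubit states and weights below are my illustrative choices, not anything taken from Weinberg's paper):

```python
import numpy as np

# Two different ensembles (pure states + probabilities) for one density matrix.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
ketp = (ket0 + ket1) / np.sqrt(2)   # |+>
ketm = (ket0 - ket1) / np.sqrt(2)   # |->

def mixture(pairs):
    """rho = sum_i p_i |psi_i><psi_i| for (p_i, |psi_i>) pairs."""
    return sum(p * np.outer(psi, psi.conj()) for p, psi in pairs)

rho_z = mixture([(0.5, ket0), (0.5, ket1)])   # ensemble of |0>, |1>
rho_x = mixture([(0.5, ketp), (0.5, ketm)])   # ensemble of |+>, |->

# Both ensembles yield the identical density matrix, hence identical
# predictions: the choice of decomposition carries no physical information.
print(np.allclose(rho_z, rho_x))   # True
```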

Weinberg sort of suggests that he discovered that only the density matrix matters (while the particular decomposition does not). Sorry to say but at least in Prague (and I guess that almost everywhere else), those things were taught as the elementary stuff to the undergraduates and it was correctly claimed that these basic constructions and interpretations of the density matrix were pioneered by the fathers of the density matrix – Felix Bloch, Lev Landau, and John von Neumann. It's very clear why only the density matrix and not some decomposition matters: everything that quantum mechanics may predict are the probabilities and all of them may be calculated via formulae that only depend on \(\rho\) and not some "finer information" about \(\rho\) such as a decomposition.

The same comments apply to the state vector. If a physical system is described by the state vector, all predictions – the probabilities – may be calculated from the state vector itself, so e.g. a particular decomposition of the form\[

\ket\psi = \sum_i c_i \ket{\phi_i}

\] can't possibly be physical. The decomposition of a pure state to the eigenstates of a particular observable is more directly useful for those who are just planning to measure this observable. But that's it. A decomposition may be more useful than another one; but it cannot be "more right".

Now, the density matrix is a more general object than the state vector. The state vector describes the state of a physical system of which we have the "maximum knowledge" allowed by the laws of quantum physics. It typically means that we have measured the eigenvalues of a complete set of commuting observables. Even with this "maximum knowledge", i.e. a state vector, almost all predictions are inevitably just probabilistic. This directly follows from the uncertainty principle. In the case of the maximum knowledge, all the predictions may still be calculated from the density matrix\[

\rho = \ket \psi \bra \psi

\] using the same formulae we are using for the most general density matrix. None of these claims is new in any way. All of them were fully understood in the late 1920s, undergraduate students of quantum mechanics should understand them in the first semester, and Weinberg or other contemporary physicists shouldn't try to take credit for these things.
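These elementary facts (the pure density matrix is a rank-1 projector, the overall phase drops out of \(\rho\), and the Born-rule probabilities come from \({\rm Tr}(\rho P)\)) can be verified in a few lines. A sketch with an assumed toy qubit state:

```python
import numpy as np

# A pure state as a special density matrix: rho = |psi><psi|.
psi = np.array([1, 1j], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Maximum knowledge: rho is a rank-1 projector, so Tr rho^2 = 1.
print(np.isclose(np.trace(rho @ rho).real, 1.0))   # True

# The overall phase of |psi> cancels in rho entirely.
rho_phase = np.outer(np.exp(0.7j) * psi, (np.exp(0.7j) * psi).conj())
print(np.allclose(rho, rho_phase))                 # True

# The same formula Tr(rho P) gives Born-rule probabilities, here |<0|psi>|^2.
P0 = np.diag([1.0, 0.0]).astype(complex)
print(np.isclose(np.trace(rho @ P0).real, 0.5))    # True
```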

Here we seem to be faced with nothing but bad choices. The Copenhagen interpretation assumes a mysterious division between the microscopic world governed by quantum mechanics and a macroscopic world of apparatus and observers that obeys classical physics.

Like always in science, we are facing both good choices and bad choices. Quantum mechanics – as explained by the "Copenhagen interpretation" – precisely formulates what the laws of physics may do and may not do, what is physical and what is unphysical. Only results of measurements are real facts and the laws of physics may calculate the (conditional) probabilities that the observations of a certain property or quantity will be something or something else if the results of previous measurements were something or something else.

This basic paradigm – that only observations are meaningful and only probabilities may be predicted – isn't open to "interpretations". These conceptual assumptions are the postulates of quantum mechanics – the physical theory, not a direction in philosophers' or artists' babbling – in the same sense as the equivalence of different inertial systems is a postulate of the special theory of relativity. And indeed, every good "modern" way of talking about quantum mechanics – consistent histories, quantum Bayesianism, and perhaps others – agrees with these basic pillars of modern physics. Every wrong way of talking about quantum physics – de Broglie-Bohm theories, many worlds, and Ghirardi-Rimini-Weber or Penrose-Hameroff kindergarten-like-real collapses, among others – tries to deny that physics has irreversibly switched from the classical foundations to new, quantum foundations.

Quantum mechanics – and by that, I mean what the heroic Copenhagen folks discovered and what is studied by those who respect the basic foundations of quantum mechanics, not the wrong, ideologically motivated pseudoscientific delusions about what quantum mechanics "should be" – doesn't introduce any mysterious division between the microscopic world and the macroscopic world. Instead, all objects in the world, whether they are microscopic or macroscopic, obey the laws of quantum mechanics. This fact has been known since the 1920s, too. In fact, people have developed the quantum theory of crystals, conductors, gases (including Fermi-Dirac and Bose-Einstein statistics), paramagnets, diamagnets, ferromagnets, and other macroscopic materials and objects in the late 1920s and early 1930s. It is absolutely ludicrous to suggest that quantum mechanics has any problem with macroscopic objects.

What is true is that for large enough objects, classical physics works as well in the sense that it is a qualitatively good approximation of quantum mechanics. In the limit \(\hbar\to 0\), quantum mechanics may generally be approximated by a classical theory. One must still realize that quantum mechanics is always right and always exact while classical physics is only sometimes right and it is only approximately right: it is a limit of quantum mechanics. This limiting procedure has many aspects and implications. The existence of the classical limit is needed for nearly classical observers like us to be able to talk about the predicted observations and their probabilities using the same language that is used in classical physics (a fact – something has happened or not – means exactly the same thing in quantum physics as it does in classical physics; probabilities mean the same thing in quantum mechanics and classical physics, too; only the quantum mechanical rules that allow us to deduce that something will probably occur out of the knowledge of something else that has occurred in the past are different than they are in classical physics).

Decoherence is a calculable process – caused by sufficiently strong interactions of the object of interest with sufficiently many degrees of freedom in the environment – by which the information about the relative phases encoded in the state vector or, more generally, about the off-diagonal elements of the density matrix in a basis (one that ultimately agrees with – and defines – the common-sense decomposition) is rapidly being lost. That process implies that the probabilities encoded in the density matrix may be approximated by classical ones because the potential for characteristic quantum phenomena in the future – in particular, re-interference of parts of the state vector – has been practically lost.

But decoherence doesn't weaken quantum mechanics in any way; it is not an addition to quantum mechanics that has to be made to fix a "bug" in quantum mechanics. Quantum mechanics has no bugs. Decoherence is a consequence of the laws of quantum mechanics that justifies the classical reasoning as an approximation in certain situations. You may still insist on the exact quantum interpretation of all the probabilities, however! None of these insights weakens the fact that quantum mechanics is a perfectly and exactly valid theory of the microscopic world as well as the macroscopic world. Niels Bohr and Werner Heisenberg not only knew it but they helped to lay the foundations of the actual modern quantum mechanical theories of many macroscopic objects and phenomena. It is a disrespectful untruth for someone to suggest that something fundamental was missing in the Copenhagen school's description of macroscopic objects. This untruth has almost become the consensus of popular books on quantum mechanics which doesn't make it any less outrageous. Such attacks on the universal validity of quantum mechanics are as outrageous as the attacks against heliocentrism voiced 90 years after the Galileo trial.

If instead we take the wave function or state vector seriously as a description of reality, and suppose that it evolves unitarily according to the deterministic time-dependent Schrödinger equation, we are inevitably led to a many-worlds interpretation [2], in which all possible results of any measurement are realized.

There exists no sequence of logical arguments with reasonable assumptions that would imply that "all possible results are realized". And there doesn't even exist any "theory" that would describe at least basic features of the world around us in agreement with the paradigm that "all possible results are realized". Every time someone "concludes" that all results are realized, the conclusion follows either from sloppy and circular thinking, cheating, a brain defect, or a combination of these three reasons. Weinberg derives many things as a real perfectionist so it's due to a non-Weinbergian trait when he suddenly claims that something clearly invalid and indefensible may be derived from something else (which is also wrong, but in a different way) – that we are "inevitably led" somewhere. We are surely not. If there were a glimpse of an argument that makes sense, Weinberg would show it instead of screaming words like "inevitably".

These comments about the non-existence of the "logical derivation" are not too important because the assumption, the claim that "the state vector describes the objective reality", is demonstrably wrong.

To avoid both the absurd dualism of the Copenhagen interpretation and the endless creation of inconceivably many branches of history of the many-worlds approach, some physicists adopt an instrumentalist position, giving up on any realistic interpretation of the wave function, and regarding it as only a source of predictions of probabilities, as in the decoherent histories approach [3].

The "instrumentalist" position is exactly what the founders of orthodox quantum mechanics (Copenhagen school: Bohr, Heisenberg, Jordan, Born, Pauli, Dirac, and so on) would defend. They may have called it a "philosophy", and a "positivist" one, not an "instrumentalist" one, and the wordings and sociology may have changed but the physical content is exactly the same. The content is also the same as the content of the slogan "shut up and calculate". You just shouldn't insist on talking about things that cannot be measured. You may talk about them but it is perfectly fine for a theory to declare some or all unmeasurable questions to be physically meaningless and a person criticizing a theory for its not talking about unmeasurable things is simply not acting as a scientist!

In my opinion, this is the most correct interpretation of the positivist/instrumentalist/shut-up-and-calculate attitude and the general philosophy of this attitude was really uncovered by Einstein's thoughts about relativity, too. At least Heisenberg would always credit Einstein with bringing this general positivist philosophy to physics. Einstein realized that a theory doesn't have to define the objective meaning of the "simultaneity of two events" because there exists no objective instrumental test of whether or not two distant events occurred simultaneously. The only difference between relativity and quantum mechanics is that relativity only declared a small number of things "subjective" (well, observer-dependent) and people got used to it while many things remained Lorentz-invariant. Quantum mechanics makes every and any knowledge fundamentally subjective but the logic why it's fine is qualitatively the same as it is in the case of the simultaneity of events in relativity! The fundamental universal reason why it's fine for a theory to declare things subjective, i.e. observer-dependent, is that an observer is needed to make observations (sure!). So in general, every part of the observation may depend on the observer. There are things that the observers will ultimately agree upon (Will a fast broom be caught in a barn in relativity? Has the Schrödinger's Cat started a nuclear war a minute ago?) but the agreement may be a nontrivial derived fact while the intermediate steps in the derivations may be different for different observers. The agreement between different observers doesn't have to be and isn't due to the fundamentally and exactly objective character of almost everything in the world!

The other problem with quantum mechanics arises from entanglement [4]. In an entangled state in ordinary quantum mechanics an intervention in the state vector affecting one part of a system can instantaneously affect the state vector describing a distant isolated part of the system.

Entanglement isn't a problem. Entanglement is the generic as well as the most general quantum description of correlation(s) between two subsystems. Almost all states are entangled, i.e. they refuse to be tensor-factorized into independent states of the subsystems. Quantum mechanics would be reduced to "nothing" or would lose its "quantumness" if entanglement were "forbidden". The predictive interpretation of the entanglement is exactly the same as the predictive interpretation of correlations in classical physics. But quantum mechanics and its entanglement may actually imply predictions that can't follow from any classical model – like simultaneously guaranteed correlations in many pairs of quantities, high correlations violating Bell's inequalities, and other things. But that's simply because it's a different theory. The class of classical theories may have looked large to many people but it's too small and constraining for the correct theories of Nature and the correct theories of Nature are quantum mechanical and refuse to belong to the classical class!

It is true that in ordinary quantum mechanics no measurement in one subsystem can reveal what measurement was done in a different isolated subsystem, but the susceptibility of the state vector to instantaneous change from a distance casts doubts on its physical significance.

It doesn't just cast doubts. It proves that the state vector – or the density matrix – can't be viewed as an objective feature of reality. Indeed, the wave function may rapidly change and if it were a piece of the objective reality, this "collapse" would be in conflict with relativity. But the actual, correctly interpreted quantum mechanics has no problem with relativity. It may be compatible with relativity and indeed, quantum field theory and string theory are guaranteed to be compatible with relativity.

The state vectors or density matrices are data summarizing the subjective knowledge about the physical system. And the "collapse" is nothing else than the subjective process – taking place in the brain – that allows us to replace the original complicated probability amplitudes encoding distributions of all quantities by the conditional probability distributions in which the already known outcomes of measurements are taken into account as facts.

I could go on for a while. Although Weinberg avoids writing "clearly and atrociously wrong" claims about the foundations of quantum mechanics that others like to produce, I don't really feel comfortable with a single sentence he is writing about the right interpretation of quantum mechanics, its applicability, or its history.

Generalizing transformations of density matrices

But these provocative comments about the foundations of quantum mechanics are not supposed to be the key content of the preprint, I guess. Instead, Weinberg wants to generalize symmetry transformations that may apply to density matrices.

The basic "modest proposal" is that the density matrix is more fundamental than a state vector. Well, I agree with that, kind of. The pure density matrix\[

\rho = \ket\psi \bra\psi

\] is a special case of the density matrix corresponding to the "maximum knowledge". Note that the overall phase of the state vector \(\ket\psi\) does not affect the predictions because the phases cancel in \(\rho\) and all predicted probabilities may be calculated using this \(\rho\). While the state vector (pure state) may be viewed as a special example of a density matrix, it is sort of sufficient, too, because the most general density matrix may be written as a mixture (a real linear combination) of projectors \(\ket{\psi_i}\bra{\psi_i}\) built from pure vectors with some probabilities as the coefficients (see the formula at the top and use it to diagonalize the density matrix; the vectors on the right hand side will be orthogonal to each other in this case). The probabilities predicted from a density matrix are therefore weighted averages of the probabilities predicted from pure states – the weighted averaging is no different than in the corresponding classical calculation. For this reason, the "quantum essence" of the predictions is hiding in the pure states and the density matrices may be – but don't have to be – considered an "engineering addition" added on top of the calculus of pure states. The probabilistic ignorance from both sources (the unavoidable uncertainty hiding already in the pure states; and the probabilistic, classical-like mixing from the density matrices) gets mixed up and both types of ignorance may be treated together using the natural and simple formalism of density matrices, which is why it's totally OK to think that the equations involving density matrices are "fundamental".
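The diagonalization just mentioned, and the claim that predictions from a mixed \(\rho\) are weighted averages of pure-state predictions, can be checked directly. A sketch with an assumed 2×2 example:

```python
import numpy as np

# Diagonalizing a mixed density matrix reproduces the mixture formula with
# orthonormal pure states; the 2x2 matrices are assumed toy examples.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
L = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)  # some observable

p, V = np.linalg.eigh(rho)                 # eigenvalues = probabilities
rebuilt = sum(p[i] * np.outer(V[:, i], V[:, i].conj()) for i in range(2))
print(np.allclose(rho, rebuilt))           # True

# <L> from rho equals the p_i-weighted average over the pure eigenstates.
avg = sum(p[i] * (V[:, i].conj() @ L @ V[:, i]).real for i in range(2))
print(np.isclose(np.trace(rho @ L).real, avg))   # True
```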

What does it mean to have a symmetry in quantum mechanics? We mean a linear operator \(U\) that acts as\[

\ket\psi \to U \ket\psi

\] Simple. The corresponding bra-vector \(\bra \psi\) is just the Hermitian conjugate so it transforms accordingly:\[

\bra\psi\to \bra\psi U^\dagger.

\] Because the density matrix is a combination of \(\ket\psi\bra\psi\) objects, it transforms as\[

\rho \to U \rho U^\dagger.

\] Great. If the density matrix as well as operators \(L\) transform by this conjugation and if \(UU^\dagger=U^\dagger U = 1\) i.e. if the transformation is unitary (linear and preserving probabilities), then the expectation values\[

{\rm Tr}(\rho L_1 L_2\dots )

\] are conserved because \(U^\dagger U\) cancel everywhere including the beginning and the end (due to the cyclic property of the trace). All predicted probabilities may be written in this form as well, with \(L_i=P_i\) chosen as some projection operators on the Yes subspace of the Yes/No questions, so the predicted probabilities may be invariant under the symmetry transformations, too.
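The cyclic cancellation of the \(U^\dagger U\) factors is easy to verify numerically. A sketch with assumed random 3×3 data:

```python
import numpy as np

# Conjugating rho and every operator by the same unitary U leaves the traces
# Tr(rho L1 L2 ...) unchanged: the U^dagger U factors cancel cyclically.
rng = np.random.default_rng(0)

def rand_herm(n):
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (A + A.conj().T) / 2

rho = rand_herm(3)
rho = rho @ rho.conj().T               # make it positive semidefinite
rho /= np.trace(rho).real              # normalize to unit trace
L1, L2 = rand_herm(3), rand_herm(3)

# A random unitary via QR decomposition of a random complex matrix.
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
conj = lambda X: U @ X @ U.conj().T    # X -> U X U^dagger

before = np.trace(rho @ L1 @ L2)
after = np.trace(conj(rho) @ conj(L1) @ conj(L2))
print(np.isclose(before, after))       # True
```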

(Consistent histories work with a mild generalization of this formula for probabilities in which we consider probabilities of whole histories i.e. traces \[

{\rm Tr}(P_n\dots P_2 P_1\rho P_1 P_2 \dots P_n)

\] involving traces of products of the density matrix and a multiplicative sequence of several projection operators encoding different properties at different times. For the different histories to be mutually exclusive in the classical sense, we demand a sort of "orthogonality" consistency conditions for these pairs of histories. The consistent histories aren't really quite new; they're just the normal Copenhagen formalism adapted to composite questions about several properties of the system at different times.)
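For a two-step history, the trace formula reduces to the chained Copenhagen recipe: measure, collapse, measure again. A sketch with assumed qubit projectors:

```python
import numpy as np

# History probability Tr(P2 P1 rho P1 P2) equals p(first yes) times
# p(second yes | collapsed state). The qubit data are assumed toy choices.
rho = np.array([[0.6, 0.1], [0.1, 0.4]], dtype=complex)
P1 = np.diag([1.0, 0.0]).astype(complex)              # "up along z?"
v = np.array([1, 1], dtype=complex) / np.sqrt(2)
P2 = np.outer(v, v.conj())                            # "up along x?"

history = np.trace(P2 @ P1 @ rho @ P1 @ P2).real

p1 = np.trace(P1 @ rho @ P1).real                     # first outcome
rho_after = P1 @ rho @ P1 / p1                        # collapsed state
p2_given_1 = np.trace(P2 @ rho_after).real            # conditional outcome

print(np.isclose(history, p1 * p2_given_1))           # True
```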

The main technical question that Weinberg is addressing in the paper is whether there may be transformations of the density matrix\[

\rho \to g(\rho)

\] that are not descendants of transformations of the pure state vectors\[

\ket\psi \to g(\ket\psi ).

\] If we restrict ourselves to transformations that descend from (linear, unitary) transformations of the pure states on the \(N\)-dimensional Hilbert space, the transformations of \(\rho\) are pretty much elements of the \(U(N)\) group. But the density matrix has \(N^2\) different real parameters (if we allow the trace to be anything) – it would be \(2N^2\) if the entries were general complex numbers but the Hermiticity reduces the number of parameters exactly to one-half. And Weinberg and others could think about transformations that may mix these \(N^2\) entries in more general ways than those descended from the pure state vector transformations i.e. different from \(\rho\to U\rho U^\dagger\). In other words, he wants to act on the matrix entries of the density matrix directly, i.e. via \(U(N^2)\) transformations of a sort (or some useful subgroup that acts on the entries differently than the action descended from the \(U(N)\) transformations of the pure states).

This is a potentially interesting business of looking for loopholes and exceptional structures.

For the time evolution (i.e. the transformation by time-translations), there is a well-known generalization of the "normal" transformation given by the Lindblad equation\[

\frac{d\rho}{dt} = -\frac{i}{\hbar}\left[H,\rho\right] + \sum_m \left( L_m \rho L_m^\dagger - \frac 12 L_m^\dagger L_m \rho - \frac 12 \rho L_m^\dagger L_m \right).

\] The commutator term describes the "normal continuous differential evolution" of the density matrix that is derived from Schrödinger's evolution of the pure state vector. The terms involving the mutually orthogonal operators \(L_m\) are new. Such a form of the time evolution may be obtained for open systems, i.e. from tracing over some environmental degrees of freedom. When you do it in this way, the evolution is naturally irreversible, time-reversal-asymmetric, and that's why Weinberg talks about the semi-group structures (a semi-group is almost like a group but the inverse element isn't required; for example, the group of renormalization group "flows" is really a semi-group because the "integrating out of the degrees of freedom" is irreversible).
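A single Euler step of the Lindblad equation, iterated, illustrates its main features: the trace and Hermiticity of \(\rho\) are preserved while the off-diagonal coherences decay. The Hamiltonian, jump operator, and step size below are assumed toy choices (a dephasing qubit):

```python
import numpy as np

# Euler integration of the Lindblad equation for a dephasing qubit.
hbar = 1.0
H = np.diag([0.0, 1.0]).astype(complex)                    # Hamiltonian
Lm = np.sqrt(0.5) * np.diag([1.0, -1.0]).astype(complex)   # dephasing jump op

def lindblad_rhs(rho):
    comm = -1j / hbar * (H @ rho - rho @ H)
    LdL = Lm.conj().T @ Lm
    jump = Lm @ rho @ Lm.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return comm + jump

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)    # pure |+><+|
dt = 0.01
for _ in range(100):
    rho = rho + dt * lindblad_rhs(rho)

print(np.isclose(np.trace(rho).real, 1.0))    # trace preserved: True
print(np.allclose(rho, rho.conj().T))         # Hermiticity preserved: True
print(abs(rho[0, 1]) < 0.5)                   # coherence has decayed: True
```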

At a fundamental level, one expects the "normal transformation" of the density matrix to be the only physically kosher one and there are partial theorems that Weinberg acknowledges. But he is looking for loopholes. The Lindblad equation is one such loophole.

But he wants to focus on more exotic, exceptional, cheeky ways to access the individual matrix entries of the density matrix. His first provocative example is\[

\rho = \begin{pmatrix} a_1 & b_1 & b_2 \\ b_1^* & a_2 & b_3 \\ b_2^* & b_3^* & a_3 \end{pmatrix}

\] where \((b_1,b_2,b_3)\) are supposed to transform as a complex triplet under an \(SU(3)\) group while \(a_1,a_2,a_3\) are three real singlets, not transforming at all. Note that these \(SU(3)\) transformations are preserving the trace (well, even the individual diagonal entries) and the Hermiticity of the density matrix.
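One may check numerically that this mutated \(SU(3)\) action preserves the trace, the diagonal entries, and the Hermiticity, yet generically changes the eigenvalues of \(\rho\), so it cannot be of the form \(U\rho U^\dagger\). The parametrization below follows the verbal description (diagonal singlets \(a_i\), off-diagonal triplet \(b_i\)) and, like the particular \(SU(3)\) element, is an assumed choice:

```python
import numpy as np

# Rotate the off-diagonal triplet (b1,b2,b3) by an SU(3) matrix while
# freezing the diagonal singlets (a1,a2,a3).
def build_rho(a, b):
    return np.array([[a[0],        b[0],        b[1]],
                     [b[0].conj(), a[1],        b[2]],
                     [b[1].conj(), b[2].conj(), a[2]]])

a = np.array([0.5, 0.3, 0.2])
b = np.array([0.05 + 0.02j, 0.01 - 0.03j, 0.04 + 0.01j])

# A random SU(3) element via QR, with the determinant normalized to 1.
rng = np.random.default_rng(1)
V, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
V = V / np.linalg.det(V) ** (1 / 3)

rho, rho_new = build_rho(a, b), build_rho(a, V @ b)

# Hermiticity, trace, even the individual diagonal entries are preserved ...
print(np.allclose(rho_new, rho_new.conj().T))                  # True
print(np.isclose(np.trace(rho_new).real, np.trace(rho).real)) # True

# ... but the spectrum generically changes, so no unitary U can give
# rho_new = U rho U^dagger: this is not a descendant of a pure-state map.
ev_old = np.sort(np.linalg.eigvalsh(rho))
ev_new = np.sort(np.linalg.eigvalsh(rho_new))
print(np.allclose(ev_old, ev_new))     # generically False: spectrum changed
```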

This \(SU(3)\) action on the density matrix is different from the descendant of the usual \(U(3)\) action on the pure states in the Hilbert space. Why? Well, under the \(SU(3)\) subgroup of the latter, the density matrix transforms as the adjoint i.e.\[

{\bf 3}\otimes \overline{\bf 3} = {\bf 8} \oplus {\bf 1}.

\] On the other hand, Weinberg's proposed mutated density matrix transforms as another 9-dimensional representation, namely\[

{\bf 3}\oplus\overline{\bf 3}\oplus{\bf 1}\oplus{\bf 1}\oplus{\bf 1}.

\] There's no adjoint representation here at all. OK, in the group-theoretical terminology that Weinberg seems to avoid for unknown reasons, we may ask with him: Are there some interesting physical models in which the density matrix transforms differently than in the adjoint representation of the \(U(N)\) symmetry acting on the Hilbert space (into which all the symmetry transformations are normally embedded)?

There are tons of reasons why the usual "adjoint representation option" is the only physically meaningful one, and Weinberg describes several of them. For example, the final portion of the preprint is dedicated to positivity – to the requirement that the general mutated transformations of the density matrix must preserve the non-negativity of its eigenvalues. Also, it's obvious that the "adjoint representation option" is the only possible one if we allow all operators, including all projection operators on arbitrary pure states, and we require the traces \({\rm Tr}(\rho L)\) to be conserved by the symmetry transformations.

But it seems to me that Weinberg doesn't articulate the most obvious reason why we want to insist on the "adjoint representation option": the trace \({\rm Tr}(\rho L)\) is contracting the operator \(L\) with something else, so this something else should mathematically transform as an operator, too. Otherwise the bilinear product would contain no invariant singlet and the traces couldn't be preserved.

The density matrix isn't really an observable – the probabilities can't be measured by a single measurement – but in some sense, it is the "operator of probabilities" (the eigenvalues are probabilities of the corresponding eigenstates, if we choose this basis) and it must transform in the same way as operators. Observables must transform in the adjoint because they are operators, stupid. We're supposed to know how to multiply them, in an associative way, so they're really matrices or generalized, infinite-dimensional matrices of a sort. Observables have transformation properties "derived" from the pure states because they may be defined as something that depends on the pure states. Another interesting issue is whether the "algebra of observables" has a preferred representation. In the simple models like non-relativistic quantum mechanics and quantum field theory (and therefore string theory in well-known backgrounds which admit a quantum-field-theory-based description), we're used to the "Yes" answer, at least morally. In the quantum mechanics generalized in the way I still count as quantum mechanics, the answer is demonstrably always "Yes"; the space of pure states is "canonical". Note that this space is large and unifies the spaces with all eigenvalues of whatever you could consider "Casimirs of an algebra". If we talk about quantum fields etc., we are considering all operators and their products, not just a limited set of symmetry generators.

One may weaken the requirement that all these things are in the adjoint representation in some way and try to look for exceptional solutions (loopholes) to these weakened requirements, and that's what Weinberg is doing. But at the end, or at the beginning, he should have asked what are the broader rules of the game or the motivation behind this whole business. Of course that if one weakens some postulates sufficiently, i.e. doesn't require the operators and/or density matrix to transform in the adjoint, there will generically be new solutions to the weakened constraints. But we must ask:

Are these new solutions physically relevant for our world or worlds that enjoy at least some qualitative kinship to our world (e.g. some highly exotic string vacua)?

Are these new solutions mathematically interesting so that these new non-adjoint exceptions are exciting to be studied for mathematical reasons?

I would say that if the answers to both questions were "No", then the research of these exceptions would be pretty much worthless. If it is not worthless, which of these two questions is answered by a "Yes"? Maybe both questions are answered by a "Yes"? That would be thrilling, indeed. The answer to the first question is more likely to be "No" and the "Yes" answer would be shocking but there may be something new waiting over there.

If the answer to the second question is "Yes", then these new solutions could be analogous to the exceptional Lie groups. Someone could think that \(SU(N),SO(N),USp(2N)\) are the only compact simple Lie groups. But there actually exist the exceptional groups \(E_6,E_7,E_8,F_4,G_2\), too. We could have overlooked them but we may find them if we're careful, too. Similarly, all the transformations on the space of density matrices are typically embedded into \(U(N)\) by assuming that the density matrix transforms in the adjoint representation. But it could perhaps transform as another representation of the group, perhaps a different group than \(U(N)\).

If such an interesting exception exists, the formulation of the "theory" using these mutated density matrices must forbid pure states. The "theory" would only work in terms of the density matrices. Is that possible? One thing to notice is that it must be impossible to fully identify a pure state by measuring a complete set of commuting observables. Pure states just shouldn't be allowed – otherwise the "theory" would have to tell us how the pure states transform as well, and the density matrices' transformation laws would have to be derived from that.

What does it mean that pure states aren't allowed in the theory? It means that there is no "classical-like knowledge" in the theory. Creatures living in that theory can't ever be 100% certain about pretty much anything. Their freedom to measure the observables (general operators/matrices on the Hilbert space) is fundamentally restricted in some universal way. If they were certain about something, that they have a pure state, then the pure states would probably be back in the game. So yes, I think that the existence of the classical limit of the usual sort also forces us to admit the usual "adjoint representation option" for the density matrices. Yes, I tend to think that in our world, at least e.g. in an \(n\)-qubit quantum computer embedded into the real world, it's possible to design a (usually complicated, composite) procedure to measure an arbitrary observable (given by any matrix on the \(2^n\)-dimensional Hilbert space). Such a procedure would have to be banned in "Weinberg's realm of loopholes". To avoid direct contradictions with the engineering tests, with quantum computation experts' ability to measure almost anything, the observables that may be measured in Weinberg's realm of loopholes, at least approximately, should be at least slightly "dense" in the space of operators.

But can't there be something that is physically "close" to the adjoint representation option but fundamentally different? Maybe it could describe the world around us, too. Let me tell you about a scenario that I can prove to be impossible but that, for a while, at least if you are just smoking marijuana, you could think is an ingenious idea. Maybe the density matrix transforms as a large irreducible representation of the monster group and the physical symmetries we know are only approximated by transformations embedded into the monster group!

Again, I can show that our world can't be like this particular proposal – and no world that looks like a "related" vacuum (e.g. other conventional enough vacua of string theory) can behave like that, either. (The monster group is relevant for the quantum description of all the black hole states in the maximally curved \(AdS_3\) background of pure 3D gravity, as Witten has argued, but I think that this theory still allows arbitrary pure states and respects the decomposition of density matrices into pure states; maybe there's some natural way to restrict the allowed values of the density matrix to a rational subset, however.) But of course, it is conceivable that some overlooked scenario involving mutated, twisted, and strangely constrained density matrices exists. It is possible that this is a gem that is waiting to be discovered and one must weaken some assumptions or axioms to find it.

The mere existence of a research project is often used to promote its importance, and that's just illegitimate in the absence of evidence

However, as always, I think it's critically important not to degrade science to an industry of rationalization of wishful thinking, of a posteriori justification of some "cool" conclusions that are actually assumptions of the industry. I think that even the question whether there exists an interesting, at least remotely physical, mutated formalism for non-adjoint density matrices is a scientific question that must be approached rationally and scientifically. Scientists must compare the evidence for and against the answer "yes, such an interesting generalization exists".

And for me, i.e. as far as I can evaluate the available evidence including the newest paper by Weinberg, the odds are way over 99.7% that such an interesting generalization doesn't exist. I am less certain about this answer than about the claim that "Bohmian, real many worlds, and objective collapses as a rewriting of quantum mechanics will always remain stinky piles of šit" – but I am still sufficiently certain that I would be willing to bet one million crowns on that assuming that the criteria of the bet would be sufficiently "objective".

60 comments:

Your entire diatribe rests on your assumption that the wave function simply encodes probabilities and has nothing further to say about objective reality. This is a reasonable, and perhaps correct, interpretation, but it is by no means indisputably correct. It is perfectly reasonable to believe that the wave function somehow also encodes something about objective reality. While the former is certainly more natural for high energy physics, the latter is definitely more appropriate for atomic or solid state physics. There are plenty of molecular phenomena that rely on one single electron being at all possible positions simultaneously. If the electron were really only at one particular spot, we would get different answers in subsequent measurements. Hartree-Fock, exchange-correlation, etc. Molecules and bubble chambers look very different.

It is remarkable that a guy as smart as Weinberg can be so confused about QM. Of course the world is quantum mechanical and QM is the only formalism that can ever describe it accurately on any scale, large or small. Classical mechanics may give an adequate answer to a practical problem but so may a simple guess. That does not make it science.

"...your assumption that the wave function simply encodes probabilities and has nothing further to say about objective reality."

The fact that the wave function or density matrix only encodes our knowledge and no objective reality is not "my assumption". First of all, it's not mine - it's due to the discoverers of quantum mechanics like Heisenberg, Jordan, Born, Bohr, and others. Second, it's not an assumption, it is the *result* i.e. major conclusion of years-long research which was the most important research in science at least in the last 150 years. Science works by falsification and 90 years ago or so, it simply falsified the idea that the world is described by classical i.e. objective reality. What is falsified can't be unfalsified - falsification is really irreversible. The only thing you may do is to deny science because you find it inconvenient - much like some people deny that the Earth orbits the Sun or that we contain DNA.

Lubos, thanks for this discourse on QM, very deep and provocative for sure. I’m far from a QM expert but I’m currently rereading Messiah’s two volume QM classic and your essay surely fits seamlessly with his take.

One thing I do consider hopelessly muddled is the multi-world interpretations of QM probabilities. I've worked with some very good physicists, one actually a student of Weinberg's, and I've been struck by how few of them had any expertise in measure theory, leaving their understanding of probability a little hollow. The (possibly unique) thing about QM is that the "experiment" yields exactly the numerical values of interest (the observable), and this means that the measurable function connecting Nature's uncertainty to our quantifications is the identity function. Statistically, then, there's nothing to interpret; you're done, and stuff like Bayesian musings is basically irrelevant. Statistics is totally subordinate to probability and it only arises when the measurable function connecting uncertainty and quantifications is unknown (consider an agricultural experiment where weights or lengths are recorded). The totality of measurable functions yielding finite variance (i.e. \(L^2\)) forms a Hilbert space and almost all statistical methods are simply finite-dimensional projections onto "nice" subspaces admitting convenient parameterizations (i.e. likelihoods). None of this occurs in QM because the experimental results are the values of interest, so, indeed, shut up and calculate.
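The commenter's point that statistical methods are finite-dimensional projections can be illustrated with the simplest case: ordinary least squares is literally an orthogonal projection of the data vector onto the span of the regressors. A sketch with synthetic "agricultural" numbers of my own:

```python
import numpy as np

rng = np.random.default_rng(2)

# n noisy measurements of a quantity depending linearly on x.
n = 200
x = rng.uniform(0, 1, size=n)
y = 1.5 + 2.0 * x + rng.normal(scale=0.1, size=n)

# Least squares = orthogonal projection of y onto the 2-dimensional
# subspace spanned by the columns of the design matrix X = [1, x].
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta          # the projection of y
residual = y - y_hat      # lies in the orthogonal complement

# Orthogonality: the residual is perpendicular to the fitted subspace.
ortho = np.allclose(X.T @ residual, 0, atol=1e-8)
```

The "interpretation" work in classical statistics lives in choosing that subspace; in a QM experiment the observed numbers are already the quantities of interest, which is the commenter's point.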

Thanks, Gene, and a good point. Quantum mechanics with its new features is the only right way to describe any object and phenomenon in the world at the accuracy of 20th century science or later. The efforts to return the description to the straitjacket of classical physics are completely analogous to a farmer's attempt to explain all the observations using the agricultural common sense for which even classical mechanics may be annoyingly abstract and hard. ;-)

If one requires a certain level of accuracy, one simply has to get used to some features of the thinking of the description that would be annoying at a lower level.

Good to hear that my take on quantum mechanics agrees with Jesus Christ. ;-) His book is probably among the canonical targets of the "anger of the interpreters".

The "literal" many-worlds-advocates' interpretation of probabilities is the same as Walter Wagner. It's the guy who said that the LHC would probably destroy the world - 50% probability - because it either will or it won't. There are two possibilities and 100/2=50, so each of them has 50%. The host said that he was not sure whether that's how probabilities work. But that's exactly how they work according to the many-worlds-advocates' picture and they don't seem to care at all that the main and only thing that physics predicts - probabilities - are replaced by some universal completely wrong blanket fractions. They don't know how to fix this 50%-50% prediction and they don't seem to care because they don't seem to care about the theory's ability to predict anything.

The only meaningful known way to talk about an electron being "in all possible positions simultaneously" is by accepting the superposition of amplitudes in quantum mechanics. Is that what you mean by "objective reality" of the wave function? If so, superposition is a standard feature of standard quantum mechanics and a particle physicist could no more make a living without it than an atomic physicist :).

As somebody else said earlier, imho you are still an idiot. Note that for drawing this conclusion, it is enough to read the first sentence of your comment. And as your quantum state seems not to evolve at all, I will not have to read a single character of what you write in the future ...

Are you really, really sure there isn't more to it? I mean it's not exactly out and out intuitive like it should be, is it? One needs to be able to see these things. You know, like it's easy to see how billiard balls work.

For example, Newton's gravity is very accurate, yet fundamentally wrong.

QM will hold only over some regime. No matter where it fails, it means that all the QM math is nothing more or less fundamental than, say the classical solution of a N body Newtonian system. Useful to predict the path of a space probe, but useless as an insight into the underpinnings of nature.

Weinberg is skeptical of QM, looking for a failure point, which is an admirable thing to do, even if you hold the view that with QM 'this time its different'.

I understand what you are saying. I just don't think it is that cut and dried. The key words you are using are observation and measurement. And yes, in that case, the Uncertainty Principle is relevant and blah blah blah... no need to rehash undergraduate material. And yes, I agree that basis states are simply math. I can solve the hydrogen atom using Hermite polynomials and it has no physical meaning whatsoever.

The complications come when we do round-about measurements, like the double slit experiment, or measuring energy levels of single molecules. The latter depends on the point-like electron being "smeared out" (that's a terrible choice of words... I know). In fact, for the calculation to work we should interpret the wave function as a charge density over all space.

I agree that when we make measurements the wavefunction is clearly a probability. But when we are not looking things are much less clear. This, I believe, is at the root of Weinberg's point.

Weinberg acknowledges as much on page 95 of his Lectures on QM. He says there is nothing absurd about the state vector being only a predictor of probabilities. He goes on, though, to lament the loss of realism. It is hard to live with no physical states, he says, and it is consistent but disappointing if the state vector is not more. Sounds like he loves the classical world too much.

I agree we should look for the failure points. I do not agree that QM will fail. Newton is not wrong and does not fail for what it is intended. QM correctly answers certain questions. It will never be wrong when used for what intended.

Looking for a QM failure point is a fool's errand, Tom. While Newtonian mechanics works only over a limited range of mass and velocity, QM has no such limits. It is complete and inalterable up to the Planck energy (and thus down to the smallest meaningful dimension) and it is scalable to the entire universe.

This sounds like pure dogmatism, of course, but it is so much more than that. There really is no conceptual way to modify QM. If you disagree the ball is in your court but you can't do it either. Those who dislike QM really do feel uncomfortable with it but that has nothing to do with science. They are just uncomfortable; that's all. Newton's gravity, by the way, is not "fundamentally" wrong; it is not even a little bit wrong. It is, of course, incomplete as a theory of the world we live in. Over its wide range of application, it yields exactly the same answers as QM. It is not Wrong!

As a professional solid-state physicist myself I feel I must rebut your assertion that it is reasonable to believe that the wave function encodes something about objective reality. It does not and it is not reasonable to believe it does. There is not a whit of difference between the meaning of the wave function for high energy physics and for solid-state physics. I would also suggest that you avoid the use of the insulting word “diatribe” when assessing Lubos’ blog. He is right and you are wrong.

When you say QM is not intuitive you are really saying that it is not intuitive to you. After almost sixty years of intimate familiarity I assure you that QM is perfectly intuitive to me. The reason it is unintuitive to you is that you are trying to understand it in terms of the world that you already know. You can't do that. In a sense you have to start over and forget much of what you already "know" about the world. It's not easy. I am of at least average intelligence and it took many years for me to get it.

This is kind of spooky. I was just making pretty much the same point about the interpretations of quantum mechanics here http://www.reddit.com/r/Physics/comments/24w5p7/wiki_declares_copenhagen_interpretation_is_the/chc7ru9?context=3. I even ended up using pretty much the same analogy with special relativity. Honestly, I don't even accept the idea that you have to give up "objective reality." You just have to give up a notion of objective reality that is close to our classical intuition.

Hi Anon, I upvoted most of your comments - are you the outer space potato man nine? ;-) Including the comment where you talk about "vitriol" on this blog. It's not vitriol, make a pH test!

I agree with you that the Born rule almost certainly won't be derived from something "qualitatively more fundamental". It's as fundamental as it can get. Classical physics was predicting classical quantities such as positions. Quantum mechanics is predicting i.e. calculating probabilities. It has to predict something that is measurable, a theory has to say what it is, and both classical physics and quantum mechanics give answers. So they're complete theories/hypotheses. On top of that, quantum mechanics is exactly correct, too.

Dear John, I am not "super quite sure" about anything but I am more sure about the fundamental role of probabilities in the exact laws of physics than I am, for example, about the existence of DNA as the carrier of the genetic information etc. Much higher than 99.9999%.

I haven't really seen DNA with my own eyes after an experiment that I have fully done, so there may be a conspiracy. On the other hand, I have verified or rediscovered the lines of evidence that make the probabilistic character of the laws of physics inevitable myself. A careful, rational thinking about the double slit experiment (or a careful thinking about any other simple enough and characteristically quantum setup) is really enough for that.

The trouble with clear explanations using simple language, such as the above, is that they can trick the ordinary reader into adopting a delusional belief that one has understood quantum mechanics.

The acid test comes when one tries to pass on that knowledge. That happened to me the other day, when I wanted to explain the uncertainty principle to a friend. Even though the friend was patient and receptive, I found myself tripping over words, mixing metaphors, making illogical leaps, and just embarrassing myself.

Well, the trouble is not really with the explanation. It is a commendable achievement to write simply and clearly.

(Another one who does an excellent job is Johannes Koelman. Hmm, a Czech and a Dutchman -- neither of whom "native speakers" of English -- are better writers than most English-only scientists? How to explain that LOL)

The trouble lies in not realizing that to explain something well, it is not sufficient -- not by a long shot -- to consume well-written texts as a reader, one should also invest the time to learn the algebra -- do the math -- and put what one has learned into one's own words... before attempting to teach others.

This means that the literal smell of pine forests is forming clouds: the highly volatile pinene and related terpene molecules rise into the atmosphere and are oxidized by photochemically generated hydroxyl radicals to make polycarboxylic acids that can then form complexes with sulfate ions and act as templates for water condensation into cloud water droplets. Here is a quick graphic of the chemistry:

http://oi61.tinypic.com/2cg0nz7.jpg

They ran mass spectrometry to show that specific small complexes, likely hydrogen-bonded, form between a specific carboxylic-acid-laden molecule and some sulfates inside their big CLOUD machine that simulates the atmosphere.

As far as I am concerned, the Uncertainty Principle can be intuitively extrapolated from fundamental physics into a "Tolerance Principled" attitude that is, for some encompassing philosophical tasks, required to be applied.

Lubos and Gene (and people with similar points of view) seem to inadvertently apply this my tenuously derived principle in their understanding of and their attitude to QM and how it should be/is best interpreted. :->

A tolerance principled attitude may have to be deployed in order to facilitate the 'production, sales and consumption' of atheistic enlightenment promoting tools for thought;

Or, at least it can in my experience facilitate a science-aligned and conservatively revolutionary accEPTance of What Is and was going on - as seen mainly from a perspective of an effectively philosophy terminating evolutionary psychology type analysis of ourselves.

The state vectors or density matrices are data summarizing the subjective knowledge about the physical system. And the "collapse" is nothing else than the subjective process – taking place in the brain – that allows us to replace the original complicated probability amplitudes encoding distributions of all quantities by the conditional probability distributions in which the already known outcomes of measurements are taken into account as facts.
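The "collapse" described here as a conditional-probability update is the standard Lüders rule, \(\rho\to P\rho P/\mathrm{Tr}(P\rho P)\) after a projector \(P\) yields "yes". A two-line numpy illustration with made-up numbers:

```python
import numpy as np

# Lüders update: after measuring a projector P and obtaining "yes",
# replace rho by P rho P / Tr(P rho P) -- the quantum analogue of
# conditionalization on the newly learned fact.
rho = np.array([[0.5, 0.3], [0.3, 0.5]], dtype=complex)   # a mixed qubit state
P = np.array([[1, 0], [0, 0]], dtype=complex)             # projector onto |0>

p_yes = np.trace(P @ rho @ P).real       # Born probability of the outcome
rho_post = (P @ rho @ P) / p_yes         # conditional state given the outcome

# The post-measurement state is the pure state |0><0|: the probability of
# the already-observed fact has been set to 1, exactly as with classical
# conditional probabilities.
```

Nothing physical "happens" to any object during this replacement; the observer merely updates the probability distributions to reflect what has been learned.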

That it 'takes place in the brain', or is entirely subjective, would suggest that different experimenters in the same room could disagree on the measured value of some quantum mechanical observable, which does not appear to be the case. I suppose (but am not sure) that different observers could indeed observe uncorrelated collapse, but for that to happen the different subspaces that these observers reside in must be entirely orthogonal, which isn't really going to happen in practice. Being placed in a separate cat-box certainly wouldn't be enough to disentangle you from your immediate environment.

But being on another planet? Could that allow us to observe the subjectivity of collapse somehow? Being in another lab a few km away?

This is what I dislike about the decoherence framework; it sounds plausible, but nobody ever bothers making a quantitative prediction.

The most fundamental and yet practical attack on the fundamentals of QM was the EPR "paradox". It was shown to be 100% in agreement with QM. This implies that the fundamental axioms of QM must also be 100% correct to avoid contradictions with relativity. First: if one could influence the outcome of an experiment even by a tiny fraction, information could be exchanged faster than the speed of light. Second: relativity forbids the conclusion that one observer causes the collapse of a state before the other observer measures the state (time ordering is not relativistically invariant for space-like separated events) let alone that something happens simultaneously and it's out of the question that the system was in a definite state to begin with. So all these classical interpretations of QM are insane. You don't like it? Go somewhere else: http://www.youtube.com/watch?v=iMDTcMD6pOw
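The no-signaling point in this comment can be checked directly: for the singlet state, Alice's outcome probabilities are independent of Bob's choice of measurement basis, so the "collapse" cannot be used to send information. A small numpy sketch (function names and angles are my own):

```python
import numpy as np

# Singlet state of two qubits: (|01> - |10>)/sqrt(2).
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def meas(angle):
    """Projectors for a spin measurement along an axis tilted by `angle`."""
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    up = np.array([c, s])
    dn = np.array([-s, c])
    return [np.outer(v, v) for v in (up, dn)]

def alice_marginal(alice_angle, bob_angle):
    """Alice's outcome probabilities for a given choice of Bob's setting."""
    probs = []
    for PA in meas(alice_angle):
        p = 0.0
        for PB in meas(bob_angle):
            p += np.trace(np.kron(PA, PB) @ rho).real
        probs.append(p)
    return np.array(probs)

# Two very different choices by Bob leave Alice's statistics untouched:
m1 = alice_marginal(0.3, 0.0)
m2 = alice_marginal(0.3, 1.2)
```

Summing over Bob's outcomes reduces his projectors to the identity, which is why the marginal is provably independent of his setting, i.e. why the correlations cannot transmit any superluminal signal.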

The path to the next breakthrough in physics is not known, and the people who will blaze the trail have to trust their uncomfortable feelings, as there is nothing else to hold onto, by definition.

Physics today holds little respect for crazy theories - which stifles progress. In other words, it's easy to publish a 'me too' paper that adds some extra polish to some sub-sub-field, but next to impossible to publish ideas which are almost certainly wrong. Which has more value to society?

In the 'old days' communication between physicists was slow (weeks to months) and thus new ideas - which are necessarily rough - could be ruminated on for years. The internet provides us with a lock down of sorts - wrong ideas are quickly crushed by the global group.

One is free to come up with an alternative to special relativity, but publishing that idea or even talking about it would get one blackballed from physics. So it's actually not strange at all. Sad perhaps, but not strange.

We are in a place in theoretical physics where the technicians are in charge.

No... that's not really correct. Superposition is generally used to describe distinct states with distinct energies. Lubos keeps bringing up the density matrix, which is probably your source of confusion. It's probably cleaner to think about a single state.

Lubos is saying that when looking at a single state the wavefunction is simply describing the probability of finding the particle at some position or momentum. I am saying I agree, but only if we do a measurement. If no measurement is made - like in the double slit experiment or for interacting electrons in a molecule - then the wavefunction definitely encodes something about reality. The single electron interacts with itself because it is everywhere at once... the wave function is like a charge density or an envelope function for plane waves or some other such thing. It is not that cut and dried... and this is probably what Weinberg is getting at.

Yes, I'm outerspacepotatoman9. I probably should have just used the same handle here but I didn't think about it. Also, from my perspective your anger is righteous anger at the prevalence of all of this unnecessary confusion! Unfortunately, a lot of the people on the other side of this issue don't see it that way so I usually warn them so they don't get all defensive. Your blog posts provide some of the most thorough discussion of these points though.

Anyway, I completely agree with everything you said here. It's funny, when I have these conversations people who like many worlds often say something like "What you are suggesting is such a small difference from many worlds, wouldn't it just be simpler to accept many worlds and forget about the wave function collapse altogether?" Well yes, it would be simpler. The only problem is that it doesn't actually work!

I have never spoken of how QM should be interpreted, Peter; you are confused. People get off on the wrong foot by even trying to interpret it. To really understand QM one must not view it through some other looking glass such as classical mechanics or, even worse, "objective reality". The only way to see the actual face of QM is from the inside, by actually using it to solve real problems. If you go down that road for a few years the light will begin to dawn.

Wrong! The path to the next breakthrough in physics is known, Tom. It is called string theory, the only viable quantum theory of gravity. As David Gross has said many times, string theory is not wrong! Get used to it.

I wasn't talking about the superposition of states with different energies or about the density matrix; it was more basic. I was responding to your statement, "There are plenty of molecular phenomen[a] that rely on one single electron being at all possible positions simultaneously," which you stated in support of the idea that "It is perfectly reasonable to believe that the wave function somehow also encodes something about objective reality."

My point was that the fact that electron is in all possible positions simultaneously is encoded in the wavefunction being a superposition of basis states |x> localized at different points in space, and that no additional "reality" of the wavefunction is needed to explain molecular phenomena. I also stated that particle physics depends on this aspect of quantum mechanics just as much as the molecular physics, since you seemed to suggest that molecular physics needed something more from quantum mechanics ("reality") than particle physics.
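The superposition of \(|x\rangle\) basis states is exactly what produces two-slit interference: the probability density \(|\psi_1+\psi_2|^2\) contains a cross term absent from \(|\psi_1|^2+|\psi_2|^2\). A toy sketch with made-up wavenumber and slit geometry:

```python
import numpy as np

# Two-slit toy model: amplitudes from slit 1 and slit 2 at screen position x.
# The observable density |psi1 + psi2|^2 contains the interference cross
# term 2*Re(conj(psi1)*psi2); what is being added is a probability
# amplitude, not a classical field.
x = np.linspace(-5, 5, 1001)
k, d = 10.0, 1.0                       # made-up wavenumber and slit offset
psi1 = np.exp(1j * k * np.sqrt(1 + (x - d) ** 2))
psi2 = np.exp(1j * k * np.sqrt(1 + (x + d) ** 2))

p_both = np.abs(psi1 + psi2) ** 2                # both slits open: fringes
p_sum = np.abs(psi1) ** 2 + np.abs(psi2) ** 2    # no interference term

interference = np.max(np.abs(p_both - p_sum))    # cross term is nonzero
```

The cross term is maximal at the symmetric point \(x=0\), where the two path lengths agree and the amplitudes add in phase.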

"At a fundamental level, one expects the "normal transformation" of the density matrix to be the only physically kosher one and there are partial theorems that Weinberg acknowledges."

Well, if one assumes that the environment has an infinite number of degrees of freedom (see also: http://en.wikipedia.org/wiki/Lindblad_superoperator and/or QFT) and that a perfect isolation of a QM system from it is never possible, then from the operationalist's point of view, the Lindblad equation is as "fundamental" as it can be, and one has to live with non-unitary time evolution FAPP.

Best.

I agree that in certain cases the additional Lindblad terms are negligible.

But I regard the coupling with the environment as the most general case. If you are in a situation where the additional terms are relevant and you want to do "better", using \(\dot\rho=-\frac{i}{\hbar}[H,\rho]\) only, you have to specify the initial conditions for the system + the environment which may be practically or even in principle impossible.

But even if one could sustain unitarity for some time, the very act of observation opens up the system to the observer and to the environment and one is back to Lindblad.
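The Lindblad discussion above can be made concrete with a toy pure-dephasing example (parameters and the simple Euler integration are my own choices; just a sketch):

```python
import numpy as np

# Toy Lindblad evolution of a single qubit with pure dephasing:
#   drho/dt = -i[H, rho] + gamma * (Z rho Z - rho),  with H = 0 here.
# The unitary part alone preserves purity Tr(rho^2); the Lindblad term
# drives the off-diagonal elements (coherences) to zero.
Z = np.diag([1.0, -1.0]).astype(complex)
gamma, dt, steps = 0.5, 0.001, 4000

# Start from the pure state |+> = (|0> + |1>)/sqrt(2).
plus = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(plus, plus).astype(complex)
purity0 = np.trace(rho @ rho).real       # = 1 for a pure state

for _ in range(steps):
    rho = rho + dt * gamma * (Z @ rho @ Z - rho)   # simple Euler step

purity = np.trace(rho @ rho).real        # has decayed toward 1/2
coherence = abs(rho[0, 1])               # decays roughly like exp(-2*gamma*t)
```

The trace stays equal to one throughout, but the state becomes mixed: that is the non-unitarity "FAPP" being discussed, which appears once the environment is traced out rather than at the fundamental level.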

" ... the non-unitarity you are talking about does not appear fundamental." - What does it mean for something to be "fundamental" in physics ?

Very nice review of Weinberg's paper, Luboš! Since Weinberg has done a lot of brilliant work before, I am inclined to give him some benefit of the doubt. It looks like this is unfinished work in progress. He is searching for alternatives. He firmly believes that there is a measurement-interpretation problem and does not believe in any known "interpretations", as he mentioned in his book on quantum mechanics. Then it is hard to imagine that he can overcome that by just changing it to an equivalent formalism of density matrices and not mentioning wave functions and state vectors. That would be like sweeping the dirt under a carpet. The dirt is still there, though not visible! So it may very well be that he is searching for alternatives to the usual Schrödinger equation, starting with novel forms of the density matrix.

Your analogy with the early understanding of Newtonian physics sounds reasonable but it is deeply flawed. The absolute limits to what can ever be observed (anything that can possibly have a causal relation to us) in terms of size, mass, energy or time were completely unknown in those early days. Now these limits are understood and it is clear that modern physics covers the entirety of observation space. Since science is only about what can be observed the game is over and there is zero wiggle room for fudging or for finer approximations to string theory. There is no room for adjusting string theory; it is completely rigid. The same is true for ordinary QM, which is the correct theory when gravity is omitted from the formalism.

@Swine flu (your name is a killer!), do you believe that the wavefunction describes an infinite number of particles (in this reality) representing an electron in the hydrogen atom, as an example? If so I agree 110%.

Thanks for your patient and polite comment and for inadvertently prompting me to try to explain what I meant, differently this time!

In contrast to your interpretation of my attitude to QM, I see myself as having a complete non-expert's respect and appreciation of your pragmatic attitude to QM. :-)

Here is my ~point~ put slightly differently:

As matter of principle, a person who strives [whether with the inertia of habit or the even greater 'inertia' of a by CURSES insidiously co-motivated addictive habit] to with increasing resolution perceive (with or without mathematics and with ordinary or extremely high intelligence) any aspect of What Is/was going on will eventually require some sort of 'tolerance principled' attitude to not end up performing or being preoccupied with an "exercise in futility" (or with producing something even worse).

Besides, I wonder if one could not rather legitimately blame a deficient capacity for self-observation, specifically not being able to observe oneself being in need of adopting a ~tolerance principled~ attitude, for a lot of what you and especially Lumo more easily perceive than explain as essentially silly attempts to revise, or to discover some discrete and immutable mathematical truths behind, the observed phenomena described and predicted by means of quantum mechanical probabilities.