Matthew J. Donald

This is an edited and expanded version of the overheads and notes that I prepared for a seminar, given in Cambridge on 26th February 2007, in the Philosophy of Physics series organized by Jeremy Butterfield.

* * * * * * * * * * * * * *

Abstract:
Confused ideas about the weirdness of quantum mechanics have sometimes been blamed for the spread of anti-realist positions in philosophy. In this seminar, I shall re-examine the relation between realism and quantum theory. My goal is to argue that one can remain a realist in a reasonably familiar sense, while adopting a theory which amounts to a form of idealism. After sketching the abstract mathematical structure of quantum theory, I will introduce realism and consider some of its problems and some counter-arguments. Next I will look at why quantum theory needs an interpretation and at some of the features common to many proposed interpretations. Then I will discuss some of the gaps in decoherence theory, when it is considered as an interpretation of quantum theory, and I will end with a sketch of my own realist version of idealism in which the fundamental entities are structures which define minds, and the fundamental laws govern the stochastic developments of those structures.

Warning: In an academic paper, the purpose is to convince the experts — even when the author is alone in his expertise. The purpose of a seminar, however, is to give non-experts a rapid overview of some aspects of a subject. This means that the
arguments put forward in a seminar are almost always incomplete. A seminar for a mixed audience may include occasional comments directed at experts which most of the audience ought to ignore. The main goal for the audience, at least as long as they trust the speaker, ought to be to try to understand the outlines rather than the details of the arguments. These notes are provided in this spirit.

* * * * * * * * * * * * * *

Realism, the Interpretation of Quantum Theory, and Idealism.

At one level, the mathematical structure of quantum theory is very well understood. I shall begin by sketching some of the elements of that structure in a very general, if rather abstract, version. This is mainly just scene-setting, but it is also in order to encourage reflection on the gaps between quantum theory as abstract mathematics, quantum theory as an extremely successful tool for describing and making predictions about physical systems, and quantum theory as part of a complete picture of reality — including the reality we experience.

The fundamental ingredients of the theory are States and Observables.

The set of observables K is a *-algebra with an identity 1, that is

K is a vector space over the complex numbers, so that if A, B are in K and a, b are complex numbers then aA + bB is in K.

A multiplication is defined on K, so that A, B in K implies that AB is in K.

An anti-linear conjugation * is defined on K, so that A in K implies that A* is in K.
The word “observable” is usually used to refer just to the self-adjoint elements of K but, for the present outline, it is convenient to broaden the term.

As well as algebraic properties of this type, suitable topological properties also need to be imposed on K. The details are not important here.

The standard example is when K is the set of all bounded operators on some Hilbert space and 1 is the identity operator.

In the case where this Hilbert space has finite dimension n, this set will consist of all n by n matrices and 1 will be the n by n unit matrix.
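
These algebraic properties can be checked in a small numerical sketch, taking * to be the conjugate transpose on 2 by 2 complex matrices. The particular matrices are random illustrations, not anything from the text.

```python
import numpy as np

# A toy check, for n = 2, of the *-algebra properties listed above,
# with * taken as the conjugate transpose on n x n complex matrices.
n = 2
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
a, b = 2.0 + 1.0j, -0.5j

C = a * A + b * B          # vector space: aA + bB is again in K
D = A @ B                  # multiplication: AB is again in K
one = np.eye(n)            # the identity 1

# Conjugation is anti-linear and reverses products: (AB)* = B*A*.
assert np.allclose(D.conj().T, B.conj().T @ A.conj().T)
assert np.allclose((a * A).conj().T, np.conj(a) * A.conj().T)
assert np.allclose(one @ A, A)
```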

In mathematical terms, a state on K is just a linear map ρ from K into the complex numbers which satisfies ρ(A*A) ≥ 0 for all A in K and ρ(1) = 1.

If K is the set of bounded operators on a Hilbert space, then states correspond to density matrices, identified by the equation ρ(A) = tr(ρA). The idea of states as wavefunctions is inappropriate when we deal with macroscopic systems occupying thermal states. It is possible, assuming that quantum theory applies universally, that the state of the entire universe might be pure but we have no way of confirming that. It is possible, in a similar way, that the entropy of the entire universe might be constant, or even zero. Locally, just as entropy is almost always increasing, states will almost always be mixed; except in simple systems in which uniqueness is forced by a lower bound on energy or by a carefully constructed preparation method.
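
The correspondence between states and density matrices can likewise be sketched numerically: a density matrix is positive with unit trace, and the value ρ(A) is tr(ρA). The 2 by 2 matrices below are illustrative, not drawn from any particular physical system.

```python
import numpy as np

# A sketch of a state as a density matrix rho, with rho(A) = tr(rho A).
rho = np.diag([0.7, 0.3]).astype(complex)    # a mixed state
A = np.diag([1.0, -1.0]).astype(complex)     # a self-adjoint observable

def state_value(rho, A):
    """The value rho(A) = tr(rho A) of the state on the observable."""
    return np.trace(rho @ A)

# Positivity rho(A*A) >= 0 and normalization rho(1) = 1.
B = np.array([[0.0, 1.0], [2.0, 0.0]], dtype=complex)
assert state_value(rho, B.conj().T @ B).real >= 0
assert np.isclose(state_value(rho, np.eye(2)), 1.0)
# The expectation of A in this state is 0.7 - 0.3 = 0.4.
assert np.isclose(state_value(rho, A).real, 0.4)
```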

An abstract mathematical structure of the type sketched above can be used to describe a vast range of physical systems.

It is possible to think of all of these systems as emerging within a single over-arching “theory of everything” which would itself be a quantum theory of states and observables.

Local quantum field theory provides one guess as to a staging post on the way to a theory of everything; although presumably its spacetime structures should themselves emerge from the theory of quantum gravity.

Local quantum field theory assumes the existence of a fixed background spacetime manifold M. For each sub-region Λ of M, a subalgebra K(Λ) is constructed.

The most important aspect of this construction, for present purposes, is the way in which it provides a natural generalization of the Heisenberg picture. Recall that in that picture, in ordinary quantum mechanics, states are thought of as fixed, and operators as changing with time under the appropriate Hamiltonian. In local quantum field theory, operators at different times will, in general, belong to different local algebras and a fixed global state ω will be described at different times by its restriction to those different local algebras. Spacetime, therefore, is not modelled as an arena in which things happen, but as a structure relating numbers of the form ω(A) for different operators A.

Understanding the ways that operators and states should be used in descriptions of physical systems is one of the central goals of quantum theory. For example, quantum chemistry describes molecular structures as ground states, or at least low-energy states, of electro-magnetic interactions between electrons and nuclei. Particle physics invokes scattering states of elementary, or quasi-elementary, particles. Quantum statistical mechanics models phase transitions using states of systems with infinite numbers of simple components. In each case, the theorist can provide a good explanation for the suggested form of the interaction, and the states s/he constructs have intuitively satisfying properties as well as experimentally confirmed ones.

At this level of ordinary theoretical physics, clearly a lot of what can quite properly be called “interpretation” is going on. Basic to this intuitive form of interpretation is the idea of a value ρ(A) of a state ρ on an observable A as an expectation. This idea is sometimes referred to as the “Born rule” as it generalizes Born's suggestion in 1926 that the squared amplitude of a wavefunction is a probability.
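
The Born rule in its simplest form can be illustrated in a few lines: for a normalized wavefunction, squared amplitudes are probabilities, and each such probability is also the expectation ρ(P) of a projection. The amplitudes below are invented for the example.

```python
import numpy as np

# Born's 1926 suggestion: squared amplitudes of a normalized
# wavefunction are probabilities summing to one.
psi = np.array([3.0, 4.0], dtype=complex) / 5.0   # normalized: |psi| = 1
probs = np.abs(psi) ** 2                          # squared amplitudes
assert np.isclose(probs.sum(), 1.0)

# Each probability is also an expectation rho(P_i) of a projection,
# with rho the pure state |psi><psi|.
rho = np.outer(psi, psi.conj())
for i in range(2):
    P = np.zeros((2, 2), dtype=complex)
    P[i, i] = 1.0                                 # projection onto basis state i
    assert np.isclose(np.trace(rho @ P).real, probs[i])
```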

The problems of the interpretation of quantum theory arise directly from the conflict between the idea of quantum states as descriptions of physical situations and the idea of quantum states as providing probabilities. In particular, it is not clear whether the probabilities are supposed to be probabilities of physical events or probabilities for observations of physical events. Nor is it clear what is supposed to constitute a physical event or an observation.

Perhaps one thing we need is some way of defining the probability of observing one quantum state, or density matrix, σ, given another state, or density matrix, ρ.

Before considering how the mathematical structure of quantum theory might correspond to observed reality, we should consider what such a correspondence might imply.

In its broadest sense, realism is simply the idea that there are truths about reality which are independent of what we want, of what we know, and of what we can verify.

As a realist and a physicist, I hold that our possible experiences and their probabilities are affected, shaped, and ultimately determined by such truths about reality and that it is plausible that those truths take the form of mathematical laws.

Of course, it is certainly the case that our experiences depend on who we are.

A gardener, for example, may reasonably be said to see a different flower-bed from someone with no knowledge of plants.

A bat “sees”, or more precisely echo-locates, a world which is very different from that of a human being, despite there being common features, such as those which we would see as corresponding to walls and trees.

Moreover, I still retain enough of the existentialism I was influenced by in my youth to believe that the most important aspect of human freedom lies in our apparent ability to choose our attitudes to reality. For example, I can choose to view my unemployment as a glorious opportunity to do what I want, or as a complete rejection by my peers.

Nevertheless, as a physicist, I am a compatibilist about free will and I believe that, ultimately, it is reality which forms both us and our experiences. Indeed, reality is what we cannot choose.

Physics and existentialism come about as close as it is possible to get to genuinely “Non-Overlapping Magisteria”.

Realism admits the possibility of ignorance. After all, why should we expect to know, or to be able to find out, truths about reality which are independent of what we want, of what we know, and of what we can verify?

Realism suggests that not every unanswerable question is meaningless. But we do not know the extent of our ignorance.

Acknowledging our ignorance is useful when it allows us to get beyond,
“no reasonable definition of reality could be expected to permit this”

For example, it is only after we have realised that we know nothing about simultaneity that we can start setting exam questions about getting long poles into short garages.

When we realise that we do not know that quantum theory is complete, then we can start developing alternatives. Bohm theory is possible only because we do not know that there are no hidden variables.

It is a mistake to forget about our ignorance and, for example, to assume, as theoretical physicists sometimes do, that a theory has to be consistent because we are sure that it is true. When we get to the edge of what is possible in the way of direct empirical test, as we seem to have with string theory as well as with the interpretation of quantum theory, seeking to discover how and whether an idea can be made to be consistent remains an important way of developing and testing that idea. Being satisfied with a hand-waving assumption that everything will work out eventually seems often to slide into insistence that there “is only one game in town”. As described by Beller (1999), Bohr, for example, developed a tendency towards such insistence.

On the other hand, it can also be dangerous to acknowledge our ignorance, if it stops us taking our theories seriously. The many-worlds suggestion is an intriguing idea, but if we don't pick relentlessly at the concept of a “world” we will never know what the suggestion really means.

Mathematicians, who do not have experimental refutation to worry about, are used to taking their theories very seriously, and yet they also understand the importance of always being on the look-out for a counter-example.

There is a tendency among those who are unhappy with the difficulties of realism to believe that there are limits to the kinds of questions that we should ask. In particular, it is tempting to suppose, in as far as we can make sense of the supposition, that we should only ask questions to which we can get direct empirical answers.

Over-use of our imaginations can certainly lead to trouble, as, for example, if we ask for a particle's exact position and momentum.

Ultimately, however, we can only get direct empirical answers to questions about what is, not about what might be.

In quantum mechanics, Einstein, Podolsky, and Rosen, and Bell, and Hardy have shown that we need to be cautious about the conclusions of some counterfactuals. Just because every time Alice sees P+, Bob sees P- when he looks at P, does not mean that Bob knows he has P- when Alice sees P+ and Bob looks at Q.

Nevertheless, Einstein, Podolsky, and Rosen, and Bell, and Hardy also demonstrate how important it is to be imaginative.

Moreover, in more general senses, counterfactuals are fundamental to scientific investigations, to scientific theory, and even to consciousness itself. The purpose of an experiment is to discover what happens under certain circumstances, so that we can imagine what would happen were we to repeat those circumstances. A mathematical theory is defined in terms of hypotheses about what would happen given initial conditions from some entire space of possibilities. Consciousness sees through what we are to a possible world that it represents to itself. We always live with what might be as well as with what is.

So counterfactuals are both fundamental and dangerous. We must ask questions about them. We always have to investigate their roles; being careful, in particular, to learn the boundaries between the kind of possibilities we can physically achieve and the many other kinds we only imagine; and between the counterfactuals which define a theory and those which merely have an explanatory role.

The idea that we should only ask questions to which we can get answers by empirical means, slides into the claim that questions to which we are unable to gain answers are meaningless, and from there it can lead either to self-satisfaction or even to a general intellectual laziness.

“there can be no question of any unambiguous interpretation of the symbols of quantum mechanics other than that embodied in the
well-known rules which allow to predict the results to be obtained by a
given experimental arrangement described in a totally classical way” Bohr 1935

* * * * * * * * * * * * * *

“If two scientists have different states of knowledge about a system, they will assign different quantum states, and hence they will assign different
probabilities to the outcomes of some measurements.” Caves, Fuchs, and Schack 2001

This is a statement I agree with, but not when it is made in the context of other statements such as:

“Can there
also be a `microscopic reality' where every detail is completely
described? No description of that kind can be given by quantum
theory, nor by any other reasonable theory. [ . . . ]
It would eventually have to encompass everything in the universe,
including ourselves, and lead to bizarre self-referential logical
paradoxes. The latter are not in the realm of physics; experimental
physicists never need bother with them.” Fuchs and Peres 2000

“Quantum mechanics offers an insufficient basis for a
theory of everything if everything is to include consciousness.”

Mermin proposes a theory of “correlations without correlata”. Attempts to justify not investigating the nature of observers have depended on ideas which seem to me to solve the problem only in so far as it is reasonable to claim that the time can be known without anyone looking at a clock:

“just as someone who accepts the tenseless conception
of time can readily accept instants i.e. spacelike slices of spacetime, as (i)
useful or even indispensable for describing phenomena, and yet (ii) not
any substantive ontological commitment additional to the commitment to
spacetime; so also an Everettian can readily accept worlds as (i) useful or
even indispensable, and yet (ii) not a substantive commitment additional to
the commitment to actuality's being described by the universal state.” Butterfield 2001

Complementary to the insistence that we should not try to step beyond the boundaries of our immediate perceptions, is the idea that we must choose, and work within, a framework or “language game”.

It may well be the case that much of our thought is formed by the language, or the theories, within which we happen to be working, but it does not follow that all frameworks are equivalent.

Realism implies, essentially, that there is one correct framework.

If we assume that we have already discovered a great deal about that framework, and that we may be able to discover even more, then we have a reason to take our frameworks seriously.

The correct framework must be complete as well as consistent. Completeness in our theories seems to be much too much to ask, but poking at gaps in a framework can be another important way of developing and testing ideas in a post-empirical era.

The pessimistic induction is the idea that all our previous theories have eventually turned out to be radically wrong, and that therefore there is no reason to believe any of our present theories. And, it is suggested, if we cannot believe any of our theories then realism is at best an irrelevance.

The premise of the pessimistic induction however does not distinguish between radical conceptual changes and radical changes in what we can measure.

When we make the radical conceptual change of believing that it is more likely that the earth goes round the sun than that the sun goes round the earth, there is no change in the hot bright light which appears in the morning and disappears in the evening. The change comes merely from the way in which certain specialists are able to analyse the apparent motions of the apparent bodies that those specialists distinguish as being “planets” rather than “stars”.

The strange properties of systems in very rapid relative motion and the even stranger properties of very small systems present us again with difficulties not in experiencing reality, but in explaining our experiences.

The pessimistic induction suggests that our explanations may well not be correct, but the more we have to explain, the more interesting as well as the more difficult the task of providing any explanation becomes. The difficulties are precisely the constraints which allow us to judge where we have gone wrong in the past.

Our favoured explanations may change, but any theory which is allowable in the present should also have been allowable in the past. We did not previously develop our current theory, partly because we lack the imagination to construct a full range of possible theories without new empirical input, and partly because we have been seduced into believing that the true explanation is the simplest explanation.

The pessimistic induction suggests that the true explanation is probably not the simplest explanation of whatever our current empirical abilities have shown us about reality — perhaps not even the simplest complete and consistent explanation. Nevertheless, the inductive evidence that simple theories are powerful is at least as strong as the evidence suggesting pessimism.

There is a limitless abundance of arbitrarily complex theories. In my opinion, scientific progress has not called reality into question, but it has provided us with some bewildering questions about the nature of reality. Among these, not the least is the question of how and to what extent that nature can be said to be simple.

It is tempting to suppose that there was only one viable theory at any time in the past, and that there is only one viable theory today, but I do not believe that either of these suppositions is correct. I think that a realistic form of realism requires us to see ourselves as doing no more than trying to match our theories to reality with no guarantee of success. Some theories, like the idea of being a brain in a vat, are much much less plausible than others; some, like Newtonian mechanics, have limited application with limits we understand; some, like string theory, are work in progress; some, like decoherence theory, are significant but with gaps too immediate to ignore. I do not see the proper goal of scientific theory as being the discovery of the one true theory of everything, but rather as being the investigation of the best and most complete theories we are able to find. There is no reason why those theories should be unique; although there may well be social pressures which will tend to produce dominant preferences. As I say, after some criticisms of the Bohm interpretation in my FAQ:

“the Bohm interpretation is a serious and interesting attempt to answer the fundamental ontological problems raised by quantum theory.

“It is possible that there may be more than one radically-different way in which quantum theory could be understood and each of these ways might be just as compatible with empirical evidence. In my opinion, we should pursue all such ways. We may not be able to learn the truth, but we can at least circumscribe different possibilities.”

Finally, while refutability is a significant property for a scientific theory, the lack of refutability cannot be sufficient as a refutation. The universe might indeed, for example, be as multiple as the “landscape” of string theory seems to suggest. If it is, it just is, and we have no more right to deny it than a caveman would be right to claim that the Earth was flat because he was unable to travel more than fifty kilometres in a day. All we can do, if we care, is to develop and test the theory until we understand it, and then decide for ourselves what we think. One-line dismissals of, say, Marxist history, psycho-analysis, intelligent design, or many-worlds theory are worthless except as summaries of more thoughtful investigations.

1) Reality, at least on very small scales, really does seem to be pretty weird.

2) The conceptual claims are bold.

For example, it may be claimed that quantum theory is universal despite the huge gap in the number of dimensions between the systems with small numbers of identifiable states that we are actually able to manipulate and the macroscopic systems such as cats which we like to talk about.

This gap is between numbers like ten, and numbers like ten raised to the power of 10^26. This is a gap fantastically larger than the gap, which sometimes makes people doubt the plausibility of work on any variety of quantum gravity, between the length scales of say 10^-15 metres directly relevant to particle physics and the Planck scale of 10^-35 m.

It may be claimed that special relativity is unassailable — in particular that there is obviously no preferred frame,

or that quantum theory is complete — in the sense suggested in the quotation by Bohr given above.

It may even be claimed, despite the history of conceptual shifts mentioned in the discussion of the pessimistic induction, that we already have a reasonable understanding of the fundamental nature of reality, so that to be a realist one has to be a materialist and indeed to think that particles are just wee balls.

3) An interpretation of quantum theory needs to specify some sort of “classical regime”.

Such a regime may consist of a quasi-classical “world” or of many such “worlds”,

or of some sort of pattern, or series, of events, such as

spontaneous collapses,

or neural events,

or particle positions.

The pattern may be defined by a series of projections,

perhaps projections of measured operators,

or by a consistent history.

The classical regime provides something for which a quantum state can give
probabilities and defines the states which ensue in consequence: for example, if A = ∑_i a_i P_i is a “measured” operator, then a state |ψ> changes to P_i|ψ>/√p_i with probability p_i = <ψ|P_i|ψ>.
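
This projection rule is easy to sketch numerically. The three-dimensional state and the standard-basis projections below are invented for the example.

```python
import numpy as np

# For a measured operator A = sum_i a_i P_i, a state |psi> jumps to
# P_i|psi>/sqrt(p_i) with probability p_i = <psi|P_i|psi>.
rng = np.random.default_rng(1)
psi = np.array([1.0, 1.0, 2.0], dtype=complex)
psi /= np.linalg.norm(psi)

projections = [np.diag([1.0 * (j == i) for j in range(3)]).astype(complex)
               for i in range(3)]
p = np.array([np.vdot(psi, P @ psi).real for P in projections])
assert np.isclose(p.sum(), 1.0)           # probabilities sum to one

i = rng.choice(3, p=p)                    # outcome i occurs with probability p_i
psi_after = projections[i] @ psi / np.sqrt(p[i])   # collapsed, renormalized state
assert np.isclose(np.linalg.norm(psi_after), 1.0)
```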

4) Avoiding ambiguity in the classical regime is difficult.

What precisely is a world /a spontaneous collapse / a neural event / a particle / a time / a measured observable / an observer / or a relevant consistent history?

The probabilities given by the interpretations depend on the exact definitions and this dependence is naturally magnified over time, especially in as far as there are lots of “state changes” or apparent state changes in the ordinary functioning of some macroscopic systems (human brains in particular).

5) Half-baked ideas often fail.

For example, consistent histories fails because the set selection problem was not addressed.

The modal interpretation fails because instabilities in the eigenprojections of reduced density matrices were not considered.

The Bohm interpretation fails because it fails to be compatible with special relativity and with quantum field theory.

Nevertheless, tempting failure is the only way to make progress. Leaving an idea half-baked is a dereliction of intellectual responsibility.

6) It is all too easy to assume the viewpoint of an implicit and
uncharacterized observer.

“Quantum theory is a theory of correlations.”

“Quantum states are states of knowledge.”

“not any substantive ontological commitment”

But if there is no explicit assumption of any sort of physical wavefunction collapse process,

something which applies in many-worlds interpretations, in decoherence theory, in consistent histories, and, at least for the “guiding wave”, even in the Bohm interpretation,

then the true, or initial, or fundamental, or background, state of the universe (Everett's “universal wavefunction”) is profoundly alien.

Typical quantum states of physical systems in their normal environments
are mixtures with appropriate probabilities of quasi-classical quantum states.

The Argument.

Physical processes involve rapid sharing of information between system
and environment.

Localization information is particularly easily shared.

In particular, when a mesoscopic or macroscopic system, such as a dust particle or a cat, interacts with an environment of much smaller particles, such as the atmosphere or daylight, the result of the interaction can be modelled by a process of the form

∑_n a_n ψ_loc,n ⊗ φ → ∑_n a_n ψ_loc,n ⊗ φ_n.

Here ψ = ∑_n a_n ψ_loc,n is an expansion of the originally arbitrary wavefunction ψ of the larger system into localized components, and the display is supposed to show how the wavefunction φ of the bath gains information about the position of that system.

The result of this process, assuming that the φ_n are orthonormal, is that the reduced density matrix of the larger system changes from the pure state |ψ><ψ| to the mixture

∑_n |a_n|² |ψ_loc,n><ψ_loc,n|.

The resulting components of this mixed state (the localized states |ψ_loc,n><ψ_loc,n|) are fairly stable under further environmental influences.
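
This decoherence step can be checked in a minimal numerical model: the localized components become correlated with orthonormal bath states, and tracing out the bath leaves the diagonal mixture. Dimensions and amplitudes below are illustrative.

```python
import numpy as np

# After the interaction, the joint state is sum_n a_n psi_n (x) phi_n,
# with psi_n and phi_n taken here as standard basis vectors.
a = np.array([1.0, 2.0, 2.0], dtype=complex) / 3.0   # sum |a_n|^2 = 1
dim_sys = dim_env = 3

after = np.zeros((dim_sys, dim_env), dtype=complex)
for n in range(dim_sys):
    after[n, n] = a[n]
state = after.reshape(-1)                            # joint pure state vector

rho_joint = np.outer(state, state.conj())
rho_sys = np.trace(rho_joint.reshape(dim_sys, dim_env, dim_sys, dim_env),
                   axis1=1, axis2=3)                 # partial trace over the bath

# The reduced state is exactly the predicted mixture sum_n |a_n|^2 |psi_n><psi_n|.
assert np.allclose(rho_sys, np.diag(np.abs(a) ** 2))
```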

There is a splitting of the quantum state ω of the universe into local
subsystems.

Some of those local subsystems have states which split into quasi-classical states (which need not necessarily be pure).

We are (each) one of those local subsystems and are (each) aware, at each
moment, of one of the quasi-classical states of our
brains / bodies / surroundings.

From moment to moment, we move from quasi-classical state to
quasi-classical state.

Problems.

What is a quasi-classical state?

Are the probabilities “appropriate”?

How is the temporal progression from quasi-classical state to
quasi-classical state characterized? Is it well-defined?

Do we experience our past from our present records and memories?

Or by having experienced our past?

How / Why / Where do “we” come in?

Is this really a theory without implicit observers?

Does ω split naturally?

Indeed, if decoherence theory were really a theory without implicit observers, then it would have to be the case that the universal quantum state naturally and, at least in some sense uniquely, splits into a high-dimensional tree. Only then could we explain our individual historical observations by saying that they follow the splittings of some individual underlying branch. A palimpsest of multiple over-lapping branchings, as presented by pre-decoherence consistent histories theory, is not much use.

In the circumstances discussed above, which is a somewhat special case, the primary issue is whether the localized components ψloc, n can be unambiguously identified. Having defined suitable components, the theory shows that if a new particle was introduced in an arbitrary pure state, different branches would rapidly form and the contrast between them would rapidly sharpen.

For more general cases, we can see that some simple systems provide models with clear tree-like structures. This is the case, for example, in the models presented by Everett to justify his theory — something which is hardly surprising, as this is precisely what Everett was attempting to exemplify.

On the other hand, the famous singlet state (with wavefunction (|+-> - |-+>)/√2) provides a situation which models total ambiguity of decomposition.
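
This ambiguity can be made concrete: the singlet's reduced density matrix on either side is the maximally mixed state I/2, which arises as an equal up/down mixture in any spin basis whatsoever, so no decomposition is preferred. The rotation angle below is arbitrary.

```python
import numpy as np

# The singlet state (|+-> - |-+>)/sqrt(2) and its reduced density matrix.
up = np.array([1.0, 0.0], dtype=complex)
down = np.array([0.0, 1.0], dtype=complex)
singlet = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

rho = np.outer(singlet, singlet.conj())
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # trace out one side
assert np.allclose(rho_A, np.eye(2) / 2)                     # maximally mixed

# The same reduced state as an equal mixture in an arbitrarily rotated basis:
theta = 0.7
plus = np.array([np.cos(theta), np.sin(theta)], dtype=complex)
minus = np.array([-np.sin(theta), np.cos(theta)], dtype=complex)
mix = 0.5 * np.outer(plus, plus.conj()) + 0.5 * np.outer(minus, minus.conj())
assert np.allclose(mix, rho_A)
```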

What about real macroscopic systems?

What is the quantum state of an unobserved macroscopic system?

Quantum statistical mechanics provides us with one plausible answer to this question, suggesting that the state should probably be taken to be something resembling, or close to, a canonical ensemble density matrix — exp(-H/kT)/Z — for the appropriate temperature T.

Such states have high entropy and are thus highly mixed states with many incompatible decompositions.
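
A toy three-level system makes the point: the canonical ensemble exp(-H/kT)/Z, written in the energy eigenbasis, has strictly positive entropy and so is a genuinely mixed state. The energy levels and temperature below are illustrative (units with k = 1).

```python
import numpy as np

# The Gibbs state exp(-H/kT)/Z for a toy Hamiltonian, in its eigenbasis.
E = np.array([0.0, 1.0, 2.0])     # energy eigenvalues of H
T = 1.0
weights = np.exp(-E / T)
Z = weights.sum()                 # partition function
rho = np.diag(weights / Z)        # canonical ensemble density matrix

assert np.isclose(np.trace(rho), 1.0)

# Positive von Neumann entropy: the state is mixed, and so admits
# many incompatible decompositions.
p = weights / Z
entropy = -(p * np.log(p)).sum()
assert entropy > 0.0
```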

Preferred decompositions may perhaps be found by asking how long the form of a given decomposition would be stable, but there would still be questions about the appropriate scale on which a decomposition should be made.

For example, should we decompose the state of the atmosphere into components representing each individual molecule as well-localized, or should the components merely represent as definite the pattern of local fluid motions required to specify the smoke drifting from a cigarette?

At first sight, it hardly looks as if this question is important. The motion is essentially classical at both the small scale (say nanometres) and the larger scale (say microns), and so all that is involved in passing from small to large scale is coarse-graining involving a summation over 10^9 terms. Nevertheless, the classicality is only approximate, and there is no “natural” scale.

When we turn to trying to describe neural systems, as we must if we are to address the nature of “observers”, the issues become much harder.

Neural systems, being warm and wet, are, of course, obviously “decoherent”.

Nevertheless, a brain is a vast patchwork of metastable fluid systems, with the timing of each neural firing linked to that of many others.

Once again, there is no “natural” scale for state decomposition.

Different scales correspond to different branching structures which, because the system is only quasi-classical, need not be mutually compatible.

Neural instability means that minor incompatibilities amplify with time. The feeling that this does not matter, because however we make the decomposition we are only uncovering what is actually happening, is a natural one, but entirely misses the point that, in the universal quantum state, all possibilities are, in some sense, “actually” happening.

There is no “preferred basis” of brain states. Nor are there definite “pointer states for neurons”.

Counterfactual determinations of such states, by searching for the states most stable under possible environmental influences, fail to lead to unique identification,
particularly because a scale does need to be chosen, but also because there is no unique measure of stability.

Functionalism, which suggests that mental states can be analysed in terms of how an individual would tend to behave under varying circumstances, invokes similar counterfactual determinations. In my opinion, in both cases, we have to do with merely explanatory counterfactuals, rather than with theory-defining ones.

Recall that an interpretation of quantum theory needs some sort of “classical regime” and that it is difficult to avoid ambiguity in the definition of that regime.

Recall also that the central purpose of an interpretation of quantum theory is to explain “our” observations.

The minimal classical regime would be that which merely explains “our” observations.

Who is “us”?

The absolute minimal classical regime is that which only explains my observations, but solipsism is an absurdity, so the task becomes to find a classical regime which is sufficient to define human beings as observers. This has led me to attempt to analyse in an abstract way the physical expression of information in the brain.

As I have explained above, there does not seem to be a “natural” decomposition of the states describing an unobserved human brain. This means that to base a fundamental theory on a characterization of the physical expression of information in a brain requires what amounts to the identification of previously-undiscovered physical laws. This is not a process without ambiguity.

Non-locality is indeed weird, but it can be explained by assuming the existence of many futures for each observer. When Alice and Bob perform distant measurements on a shared singlet, Alice is not affected by Bob's observations until, for example, they meet and he tells her what he has seen. But the empirical failure of Bell's inequalities shows that, without doing some sort of violence to the concept of locality, Alice cannot assume that what Bob will say he saw was, in fact, being seen by him at the time that he will say that he saw it. An alternative is to suppose that Alice's conversations with Bob constitute her observation of Bob's results. This supposes, effectively, that, as far as Alice is concerned, Bob was in a superposition of states until the meeting. Ruling out solipsism, this implies that many futures are required for each observer.
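The quantitative content of the claim about Bell's inequalities can be checked in a few lines. For spin measurements on a singlet pair along analyser angles a and b, quantum theory predicts the correlation E(a, b) = -cos(a - b); with the standard choice of settings, the CHSH combination reaches 2√2, whereas any local hidden-variable account is bounded by 2. (The angles below are simply the textbook choice that maximizes the violation.)

```python
import math

def E(a, b):
    # Quantum prediction for the spin correlation of a singlet pair
    # measured along analyser angles a and b.
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2               # Alice's two measurement settings
b, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's two measurement settings

# CHSH combination; local hidden variables require |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.828, exceeding the local-realist bound of 2
```

The point of the calculation is only that the measured correlations cannot be reproduced by any assignment of pre-existing local values, which is what forces the choice between non-locality and something like the many-futures reading sketched above.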

In the balance between the classical regime and whatever lies beyond it (which we shall refer to here as the “quantum regime”), it is perhaps usual to think that the quantum regime is the more significant. The quantum regime, after all, is what the mathematical structure seems to be about. Nevertheless, the classical regime is the regime which explains our observations. In both the Copenhagen interpretation and the Bohm interpretation, one might well take the classical regime to be central. In my many-minds interpretation, there is no question. As I mentioned above, the total universal state ω is profoundly alien. It may also be presumed to be simple, perhaps even to be a vacuum state. According to this analysis, all the significant information which we see around us exists within our own individual structural histories. We exist by randomly becoming choices among possible futures, and those choices are then fixed into our pasts. We look at a cat which, for us, may be either alive or dead, and what we see becomes part of what we are.

It might seem rather absurd to suppose that a whole can be less complex than its parts, but a spacetime may be far simpler than some particular spacelike hypersurface of that spacetime. In mathematical terms, the idea of the uniform (Lebesgue) measure on the real numbers between zero and one is fairly simple. Nevertheless, almost all individual reals are infinitely rich in structure. In a similar way, in many-minds theory, the complexity is in the elements rather than in the totality, and to such an extent that it is reasonable to refer to the theory as a form of idealism. The ontology is centred on individual minds, as of course is the reality we experience.
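The contrast between a simple totality and complex elements can be illustrated numerically, using compressibility as a loose stand-in for descriptive complexity: the uniform distribution has a one-line description, while a typical sample drawn from it admits essentially no shorter description than itself. (This is only an illustrative analogy, not part of the theory.)

```python
import os
import zlib

# The distribution itself has a short, fixed-size description.
distribution_description = b"uniform measure on [0, 1]"

# A "typical" sample: 10,000 bytes standing in for the binary
# expansion of a randomly chosen real.
sample_bits = os.urandom(10_000)

# Random data is incompressible, so even maximal compression
# leaves it essentially the same size.
compressed = zlib.compress(sample_bits, 9)

print(len(distribution_description))          # a few dozen bytes
print(len(compressed) / len(sample_bits))     # ratio close to 1
```

The whole (the measure) is simpler than almost every one of its parts (the individual reals), which is the shape of the claim being made for the universal state and the individual minds within it.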

“When in broad daylight I open my eyes, it is not in my power to choose whether I shall see or no, or to determine what particular objects shall present themselves to my view” Berkeley 1710

Reality is what we cannot choose.

For Berkeley reality is God:

“the ideas imprinted on the senses by the Author of Nature are called real things”. ibid

In a many-minds interpretation, reality is the (physical) laws which determine the possible futures of a mental structure.

These laws include the definition of the quantum theory, including the local algebras and the maps which relate them (or perhaps some other generalization of the unitary time development e^{-itH} of quantum mechanics without collapse); the definition of the total universal state ω; and the laws defining the structure of a mind, the possible time developments of that mind, and the probabilities of those possibilities.

In this seminar, I have taken a fairly old-fashioned attitude to reality, suggesting that it is real and that we have at least something of a handle on it.

I have also suggested that if we want to understand reality, we are going to have to work hard, and yet that it is not within our capability to do any more than come up with theories describing what we think we have discovered. We may not even be able to come up with a unique best theory, let alone to agree that we have.

Nevertheless, I have rejected the pessimistic induction because I believe that our ordinary understanding of scientific progress is quite sufficiently sophisticated for us to be able to cope with radical conceptual changes without entirely losing our bearings.

I have emphasized the importance of taking our theories seriously and suggested that as direct experimental test becomes more and more difficult, testing for consistency and for completeness becomes more and more useful. It might, for example, have been possible for a sedentary people to have postulated that they could be living on a globe, just by developing a sufficiently consistent and complete theory of astronomical observations.

I have looked at some of the gaps in decoherence theory, and introduced my own many-minds interpretation.

What then have been the benefits, if any, of my taking my theory as seriously as I could?

As far as I was concerned, I just took Everett's thesis and tried to fill some of the gaps in the story it seemed to be telling.

I did not manage to find an entirely straightforward way of doing this:

“Your [ . . . ] attempt to place on a rational footing the [ . . . ] ‘many-minds’ idea — that each stream of consciousness is a mere individual branch of a tree of diverging and contradictory streams of consciousness, all equally real in the large objective sense — appears, in view of its intricate ad hoc nature, to be more like the death rattle of a collapsing radical idea than the foundation of a viable theory of natural reality.” Stapp 2004

It seems to me that some of the problems that gave rise to this intricacy — in particular, the problems of characterizing the elements of the classical regime and of avoiding the assumption of an implicit observer — are yet to be addressed in other approaches.

On a more positive side, I do believe that I have a viable interpretation of quantum theory.

This interpretation seems to me to have a plausible ontology.

Minds should be fundamental. The “hard problem” of consciousness — the problem of why minds should exist in a material world which looks as if it would function perfectly well without them — is dissolved if reality is actually a world of possible mental structures. Moreover, I think it sensible that minds should be defined by their histories, rather than by what they are at an instant, let alone by how they would behave in circumstances which merely might occur.

In a wider sense, the ontology of time that results seems to me to have some advantages.

My interpretation is an interpretation aimed at making sense of the sort of quantum theory which can be expressed in the form set out in the first section of this seminar. In such theories, there is a general abstract idea of time as a relation between expectations of different operators on a single state. In a quantum gravity theory, I suspect that it might be possible to generalize this idea to a class of possible relations between expectations. This, I think, might well remain consistent with the quite different and more fundamental form of time in my theory, which is an aspect of our individual structures as individual observers.

I propose that the experience of any observer is that of observing a particular, identifiable, discrete stochastic process. Here the probabilities are objective numbers explicitly definable for each individual observer at each appropriate moment, and providing, for an observer at a moment, the relative likelihoods of seeing elements of a finite set of possible immediate outcomes.
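The shape of such a process can be sketched as a toy model (the names, the outcome set, and the probability rule below are purely hypothetical, chosen only to show the structure): at each moment the theory supplies a finite set of possible immediate outcomes, with objective probabilities that may depend on the observer's entire history so far, and one outcome is realized and fixed into the past.

```python
import random

def next_outcome_distribution(history):
    # Hypothetical rule: objective probabilities over a finite set of
    # immediate outcomes, conditional on the entire past history
    # rather than merely on the latest state.
    bias = 0.5 + 0.1 * (history.count("up") - history.count("down")) / (len(history) + 1)
    bias = min(max(bias, 0.0), 1.0)
    return {"up": bias, "down": 1.0 - bias}

def extend(history, rng):
    # One stochastic step: a single outcome is realized and becomes
    # a fixed part of the observer's past.
    dist = next_outcome_distribution(history)
    outcomes, weights = zip(*dist.items())
    return history + [rng.choices(outcomes, weights=weights)[0]]

rng = random.Random(0)
history = []
for _ in range(10):
    history = extend(history, rng)
print(history)  # one of the many possible futures, now fixed as a past
```

Each run of the loop realizes one of the many possible futures; the probabilities at each step are explicitly defined for the individual history, which is the sense in which they are objective in the proposal above.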

One final outcome of this picture is that any person you choose to consider is leading one of the lives that you might have led had a finite sequence of stochastic events come out differently.

Some of my remarks here may be considered as responses to ideas in Quantum Theory and the Flight from Realism by C. Norris (Routledge 2000), which I re-read in the course of writing this seminar. I recommend the book for an introduction to some of the philosophical issues.