Both Mathews and Schaffer advocate a monistic metaphysical view where matter is effectively reduced to space-time.

I agree with these authors that the dual scheme of {space-time container plus material objects} must be rejected, but think they are slightly off-track in wanting to reduce the properties of matter fields to space-time (at least to anything like space-time as we currently conceive it).
These brief comments focus on the relationship of this idea to the work of theoretical physicists. Mathews acknowledges that an early attempt to derive this reduction from general relativity failed (Wheeler’s Geometrodynamics), but still likes the metaphysical vision for philosophical reasons. In his paper Schaffer argues toward a similar goal, but along the way I think he overstates the degree to which GR and (especially) quantum field theory as we know them are congenial to this vision. QFT has matter fields housed in a separate space-time container. In GR the matter and space-time are dynamically intertwined, but the fact that you can model the geometry while leaving out matter shows that they remain distinct.

In some ways the quest for a theory of quantum gravity can (should?) be viewed as a quest for a monistic theory which is rid of the dual scheme. I continue to try to follow the different theories as a layperson to see how they come down on this issue.

String theory: originally an extension of QFT which retained the feature of having fields on a background space-time. Has evolved in many ways over the years and maybe can overcome this starting point (?).

I would note that if the latter sort of approach works, it doesn’t support Schaffer’s advocacy of priority monism (see my previous post on this topic). The underlying network would not be a very coherent whole, but a fairly ill-behaved evolving pluralism of micro-events. Even though Schaffer wants to overcome the container/object scheme, his view of space-time as the holistic fundamental object still has a bit of a hangover from the container idea in my opinion.

Tuesday, June 24, 2008

Surprising and promising results have not come too frequently in quantum gravity research, but the Causal Dynamical Triangulations program led by Renate Loll, Jan Ambjørn and Jerzy Jurkiewicz had an exciting moment in 2004. A computer simulation showed that a space-time model with the right dimensionality arose from a path integral superposition of fairly generic microscopic geometric building blocks. The team has kept up a steady stream of research investigating and seeking to extend this result, and now they have published a popular article in the latest Scientific American.
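To get a feel for how a dimension can be read off from a simulated geometry rather than assumed in advance, here is a toy sketch (my own illustration, not the CDT code, and run on an ordinary fixed 2D lattice rather than a fluctuating triangulated geometry). One standard diagnostic is the spectral dimension, estimated from how fast the return probability of a diffusing random walk decays with time; on a 2D lattice the estimate should come out near 2:

```python
from math import log

def diffuse(steps):
    """Evolve a point source under discrete diffusion (a simple random walk)
    on the 2D square lattice; return the probability mass back at the
    origin after `steps` steps."""
    prob = {(0, 0): 1.0}
    for _ in range(steps):
        nxt = {}
        for (x, y), p in prob.items():
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                key = (x + dx, y + dy)
                nxt[key] = nxt.get(key, 0.0) + p / 4.0
        prob = nxt
    return prob.get((0, 0), 0.0)

# The return probability scales as P(t) ~ t^(-d_s / 2), so the spectral
# dimension can be estimated from two time slices:
#   d_s ≈ -2 * ln(P(t2) / P(t1)) / ln(t2 / t1)
t1, t2 = 20, 40
d_s = -2.0 * log(diffuse(t2) / diffuse(t1)) / log(t2 / t1)
print(d_s)  # close to 2, the dimension of the lattice
```

In CDT the same kind of diffusion probe is run over the superposed quantum geometries themselves, which is how a large-scale dimensionality near 4 could emerge as an output rather than an input.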

I recommend the article if you have access to it. I also discussed (as best I could) the basics of the CDT approach in this earlier post, so I won’t repeat all that here. Coincidentally, I had also just read a recent paper by the team showing that their simulations generate not just the right dimensionality, but specifically a de Sitter universe.

The thing I find most interesting about CDT is that it may give some evidence that selecting asymmetric time and causality as fundamental features is important in quantum gravity (a caveat is that their simulation uses globally synchronized time, and I wonder if they can relax this assumption).

One reason to be cautious is that CDT at this point only deals with space-time, not matter. As in Loop Quantum Gravity, there is an expectation that matter fields can be coupled to the theory later on. This is in contrast to research by Fotini Markopoulou and Olaf Dreyer (see posts here and here), who think that it is the matter fields which are to emerge from a micro-quantum substrate, and that space-time geometry is to be inferred from the matter. If this works, it seems conceptually more appealing, since you’ve dealt with both space-time and matter at once. (See also this recent FQXi article on Markopoulou and Dreyer).

One last interesting aspect which I hadn’t thought much about before was discussed in the recent paper. This is the fact that CDT (and, I assume, other “emergence-style” programs) needs to be investigated by computer simulation, rather than by deriving a specific analytical result through mathematical formalism. This essentially means giving up on the traditional idea of a “theory of everything” which can be written down in a set of equations. The CDT team doesn’t see this as a weakness, and cites condensed matter theory (see also this post), for example, as a field where emergent behaviors are profitably studied without the possibility of precise description at the micro-level. They also invoke the idea of “self-organizing” complex systems. Here’s a quote:

“Think of quantum gravity as a strongly coupled system of a very large number of microscopic constituents, which by its nature is largely inaccessible to analytic pen-and-paper methods. This is no reason for despair, but a common situation in many complex systems of theoretical interest in physics, biology and elsewhere, and merely calls for a dedicated set of technical tools and conceptual notions.”

Wednesday, June 11, 2008

Patrick Suppes and Jose Acacio de Barros wrote a paper called Quantum Mechanics and the Brain (HT: Clark’s sideblog). Obviously the title was irresistible to me. However, it turns out that the title is a bit misleading. The paper has a few paragraphs discussing previous proposals regarding the role QM might play in the brain and briefly gives the authors’ views on these -- frankly without saying anything new. The paper then shifts to its main focus, which is to highlight an interesting case where there is good evidence that biological systems do exploit quantum-level phenomena: the animal eye’s demonstrated ability to react to a very faint light signal, specifically to detect and respond to a handful of photons, and probably even a single photon.

The paper describes previous experimental results with insects and other animals. Sensitivity to single photons was inferred using statistical methods in some of these tests (some of the older references in the paper were also discussed in this Usenet posting from 1996). The authors then outline a proposal for future experiments to confirm the results directly, utilizing the technological ability we now have to emit single photons on demand.
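To give a flavor of the statistical reasoning involved, here is a generic sketch (my own, not code from the paper) of the classic frequency-of-seeing analysis. If the number of photons absorbed from a dim flash is Poisson-distributed, then the steepness of the subject’s response curve as intensity varies reveals the threshold number of photons needed for detection -- a low threshold gives a shallow curve, a high threshold a steep one:

```python
from math import exp, factorial

def p_seeing(mean_photons, threshold):
    """Probability that at least `threshold` photons are absorbed, when the
    absorbed count is Poisson-distributed with the given mean."""
    p_below = sum(exp(-mean_photons) * mean_photons**k / factorial(k)
                  for k in range(threshold))
    return 1.0 - p_below

# Candidate thresholds predict differently shaped response curves as the
# mean photon count rises; comparing measured curves to these predictions
# is how a threshold of only a few photons was inferred statistically.
for k in (1, 2, 7):
    curve = [p_seeing(m, k) for m in (1, 2, 4, 8, 16)]
    print(k, [round(p, 3) for p in curve])
```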

This result would demonstrate that organisms are sensitive to the quantum realm. While we remain far from understanding the role quantum effects might play in conscious experience, this is another step toward exploring the subject.

Friday, June 06, 2008

In the standard formulation of Quantum Mechanics, there are two processes: in the absence of a measurement, a system described by a wave function evolves continuously and deterministically; when a measurement is made, the system instantaneously takes on a specific value according to the property being measured (often called a “collapse”). Everett’s Relative State formulation of Quantum Mechanics, the most famous version of which is called the Many-Worlds Interpretation, seeks to drop the collapse process from the theory. The interpretation then struggles to explain why we have the experience of measurement outcomes that we do (e.g. all outcomes are still happening but in many different worlds).
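The two processes can be caricatured for a single qubit (a minimal sketch of my own, with real amplitudes only; the rotation stands in for Schrödinger evolution):

```python
import random
from math import cos, sin

def evolve(state, theta):
    """Process 1: continuous, deterministic unitary evolution -- here a
    rotation by angle theta applied to the amplitude pair (a, b)."""
    a, b = state
    return (cos(theta) * a - sin(theta) * b,
            sin(theta) * a + cos(theta) * b)

def measure(state, rng=random):
    """Process 2: measurement 'collapse' -- the qubit jumps discontinuously
    to |0> or |1> with Born-rule probabilities |a|^2 and |b|^2."""
    a, b = state
    if rng.random() < a * a / (a * a + b * b):
        return (1.0, 0.0)   # outcome 0
    return (0.0, 1.0)       # outcome 1

state = (1.0, 0.0)
state = evolve(state, 0.3)   # smooth, reversible, probability-preserving
print("before measurement:", state)
state = measure(state)       # abrupt, probabilistic jump to a definite value
print("after measurement: ", state)
```

Everett’s move is, in effect, to delete the `measure` step and insist that the smooth `evolve` step is all there is.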

My common sense objection to the interpretation has always been based on the roots of the theory in scientific experimentation. QM was formulated to explain the phenomena we observe in the microscopic realm, including, obviously, the outcomes of measurements! It seems perverse to drop the process describing measurement.

Why then is the interpretation fairly popular? It is because the collapse process seems mysterious and “unphysical”, while the evolution of quantum systems in the absence of collapse is mathematically well behaved and, despite the many new complications and nuances, remains closer in spirit to the traditional dynamics of “matter in motion”.

I don’t know for certain whether Bertrand Russell ever wrote about Many-Worlds (it’s doubtful given the chronology – he died in 1970, and Everett’s 1957 work wasn’t widely discussed until the 1970s), but he made a careful philosophical study of physics (and of all the science of his day), and I think his work helps clarify my objection to the interpretation. Recall that Russell’s project was to analyze the data and methods of science and how they relate to human experience. He described how the realms of physical theory and experience can be brought closer together in an ontology of events and their causal relations.

Here are some quotes from Russell’s Human Knowledge (which I’m reading now, inspired by Carey Carlson’s references to it in his book):

Here’s the quote that first made me think of Many-Worlds: “Mathematical physics contains such a superstructure of theory that its basis in observation is obscured.” (p.41)

We need to remember: “…a datum for physics is something abstracted from a system of correlated psychological data.” (p.59)

And: “There is here a peculiarity: physics never mentions percepts [experiences] except when it speaks of empirical verification of laws; but if its laws are not concerned with percepts, how can percepts verify them?” (p.219)

The answer is that it doesn’t make sense to see the entities described in physics as disjoint from the events of first-person experience. Science can be seen as describing a causal network which connects up in consistent ways with experiential events (think of how the human experiences of telescopic images provide the raw material and ongoing testing for astronomical and cosmological theories). Therefore, reality is best viewed as a web of events consisting of the directly experienced and the indirectly inferred, the latter of which are the usual target of physical theories.

In quantum physics, the wave function is derived as a description of reality which connects consistently with experiential observations. It has this in common with other theories of physics. Of course with quantum physics there is always a twist. The twist is that, unlike the entities of other physical theories, the wave function can itself only be viewed as a fully objective physical entity if we pretend that the measurement events don’t exist! Here we are unavoidably given a choice to elevate the reality of the directly experienced events (measurements) or the inferred physical entity (the wave function). The events must take priority since they are the part of reality which is not known through inference. In fact quantum measurement events are the best candidates to be the raw material for a consistent ontology of the concrete world.

Given all this, I think the best interpretation of quantum mechanics is a version of the relational or perspectivist approaches. In this interpretation, a relational network of measurement events (or interactions) constitutes the concrete world. All quantum systems interact (perform measurements upon each other) with no ontological distinction between the macroscopic and microscopic realms (or the human and non-human). The wave function in this interpretation describes a system’s propensities for interaction outcomes from the perspective of a particular measuring system.