
Randomness in Nature II

What is the explanation of the apparent randomness of high-level phenomena in nature?

1. Is it accepted that these phenomena are not really random, meaning that given enough information one could predict them? If so, isn’t that the case for all random phenomena?

2. If there is true randomness and the outcome cannot be predicted – what is the origin of that randomness? (Is it a result of the randomness in the micro world – quantum phenomena, etc.?)

Before I give the floor to the commentators, I would like to mention a conference on this topic that took place in Jerusalem a year ago. The title was “The Probable and the Improbable: The Meaning and Role of Probability in Physics” and the conference was in honor of Itamar Pitowsky. Let me also mention that the Wikipedia article on randomness is also a good resource.

One way to think about what it means to say that a physical process is “random” is to say that there is no algorithm that predicts its behavior precisely and runs significantly faster than the process itself. Morally, I think this should be true of many “high-level” phenomena.

E. T. Jaynes took the position that “probabilities do not describe reality — only our information about reality.” Many supposedly random phenomena are perhaps more accurately called ergodic: deterministic with good mixing properties, like pseudo-random number generators. We can effectively model phenomena as random without settling the question of whether they are “really” random. I suppose that’s a positivist perspective.
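The point about pseudo-random number generators can be made concrete in a few lines. A minimal sketch of a linear congruential generator, fully deterministic yet random-looking (the constants are the classic Numerical Recipes parameters; the bit-balance check is just one crude indicator of mixing):

```python
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Yield n pseudo-random integers from a purely deterministic recurrence."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

# Determinism: the same seed always reproduces the same "random" stream.
assert lcg(42, 5) == lcg(42, 5)

# Apparent randomness: the leading bits are roughly balanced.
bits = [x >> 31 for x in lcg(42, 10000)]
print(sum(bits) / len(bits))  # close to 0.5
```

The same tension appears here in miniature: an observer who sees only the output stream has every practical reason to model it as random, even though complete information (the seed) makes it perfectly predictable.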

Proaonuiq: Reality and our models of reality must be distinguished. Regarding reality, it is still unclear whether there is a truly random phenomenon in nature (radioactive decay?). Regarding our models: if you are lucky, a phenomenon in nature can be captured by a random rule; with more luck, by a deterministic rule (which can be seen as a special case of randomness)… but if you are unlucky it may not be captured at all.

For an example such as lightning, the standard explanation in physics is that macroscopic randomness ultimately comes from thermal entropy via processes such as Brownian motion and “the butterfly effect”. By Brownian motion, I mean here that thermal motion causes vibration of particles, not just that the Brownian stochastic differential equation models this or that behavior. The butterfly effect is the statement that many physical systems such as air flow exhibit chaotic dynamics. Air flow is modeled by the Navier-Stokes equation to good approximation, and that plus a percolation model gives you lightning.
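The “butterfly effect” mentioned above can be illustrated without any fluid dynamics. A minimal sketch using the logistic map, a standard toy model of chaos (not a model of air flow): two trajectories that start a hair’s breadth apart become macroscopically different within a few dozen deterministic steps.

```python
def logistic_orbit(x0, steps, r=4.0):
    """Iterate the chaotic logistic map x -> r*x*(1-x) from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.3, 60)          # one trajectory
b = logistic_orbit(0.3 + 1e-10, 60)  # a trajectory perturbed by 1e-10

# The largest gap between the two trajectories grows to order 1:
# the tiny initial difference has been exponentially amplified.
gap = max(abs(u - v) for u, v in zip(a, b))
print(gap)
```

The dynamics are completely deterministic, yet any uncertainty in the initial condition, however small, is promoted to macroscopic uncertainty; this is the mechanism by which thermal-scale jitter feeds macroscopic randomness.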

Thermal entropy in turn comes from two sources. It comes from quantum randomness, and it also comes from chaotic dynamics. One explanation of quantum randomness is the phenomenon that if A and B share an entangled state with no entropy, then the marginal state of A alone has entropy.
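The last sentence can be checked directly in the smallest example. A sketch in Python with NumPy, using the Bell state: the joint state of A and B is pure (zero entropy), while the marginal state of A alone is maximally mixed (the partial-trace index bookkeeping is the only subtle step):

```python
import numpy as np

# The Bell state (|00> + |11>)/sqrt(2) on two qubits.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(bell, bell.conj())  # the joint density matrix: a pure state

# Partial trace over B: view rho with indices (a, b, a', b'), trace out b = b'.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

def entropy(r):
    """Von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(r)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return float(-(evals * np.log2(evals)).sum())

print(entropy(rho))    # 0.0: the joint state carries no entropy
print(entropy(rho_A))  # 1.0: subsystem A alone looks like a fair coin
```

So entanglement alone manufactures local entropy: the whole is perfectly definite while the part is maximally random.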

Even though there are these different flavors of randomness, it is impossible to make clean distinctions among them, or between them and the gambler’s view that randomness is incomplete information. Certainly incomplete information is sometimes a better description of randomness in higher biology. The behavior of a dog may look random to you, but maybe the dog knows its plans and didn’t tell you. (Or instead of a dog, another human!) It is perhaps a philosophical point, but my own view is that all randomness is equivalent and that the distinctions are secondary.

Gil has asked me to elaborate on a comment I made on one of the answers to the MathOverflow question mentioned above. This has to do with the perceived “randomness” in quantum mechanics. I should preface by saying that I don’t claim to possess any special insight into “randomness” largely due to not having thought about it seriously. The point of my original comment was simply to remark that time evolution in quantum mechanics is deterministic.

In classical mechanics time evolution is given by the flow of a hamiltonian vector field X_H on the phase space of the system under study. This is a first-order ordinary differential equation, and hence, provided that the hamiltonian is sufficiently differentiable, standard theorems guarantee the existence and uniqueness of solutions to the initial value problem. In other words, through each point in the phase space there passes a unique curve x(t) whose velocity x’(t) is given by the value X_H(x(t)) of the hamiltonian vector field at x(t). This is the statement that classical dynamics are deterministic. It does not mean that classical phenomena cannot exhibit “randomness” (just watch this video of a double pendulum: http://www.youtube.com/watch?v=z3W5aw-VKKA ) or chaotic behaviour (will it rain in Edinburgh next Thursday?). That is not due to lack of determinism, but due to incomplete knowledge of the state of the system.

Similarly, quantum mechanics is deterministic. A (pure) state of a system is given by a unit-norm vector \psi in a Hilbert space and its time evolution is even simpler: it is now a linear ordinary differential equation (albeit in an infinite-dimensional setting), the Schrödinger equation i\hbar\psi’(t) = H \psi(t), where H is the hamiltonian. Assuming that H is self-adjoint, Stone’s theorem says that there is a (strongly continuous) one-parameter group of unitaries U(t) such that \psi(t) = U(t)\psi(0). Hence if you know the state of the system at time zero, you know the state at any other time. A similar story applies to mixed states, which are convex linear combinations of pure states.
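For readers who want to see this determinism concretely, here is a finite-dimensional sketch: a two-dimensional toy Hilbert space, an arbitrarily chosen self-adjoint Hamiltonian (the Pauli-X matrix), and hbar set to 1. U(t) is computed via the spectral theorem.

```python
import numpy as np

H = np.array([[0.0, 1.0], [1.0, 0.0]])  # a self-adjoint toy Hamiltonian

def U(t, H=H):
    """U(t) = exp(-i t H), built from the eigendecomposition of H."""
    evals, evecs = np.linalg.eigh(H)
    return evecs @ np.diag(np.exp(-1j * t * evals)) @ evecs.conj().T

psi0 = np.array([1.0, 0.0], dtype=complex)  # the state at time zero
psi_t = U(1.7) @ psi0                       # the state at time t = 1.7

# Unitarity: the norm (total probability) is exactly preserved.
print(abs(np.linalg.norm(psi_t) - 1.0) < 1e-12)

# Determinism: evolving again from the same initial state gives the
# same final state, with no randomness anywhere in the dynamics.
print(np.allclose(U(1.7) @ psi0, psi_t))
```

The probabilities only enter later, when one asks this deterministically evolving state a classical question (a measurement), which is exactly the distinction drawn in the next paragraph.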

Although it is hard to be certain, when most people refer to the “randomness” in quantum mechanics, they are usually referring to the probabilistic nature of quantum physics. There is a LOT one could say about this. The main problem, I think, stems from trying to get a quantum system to answer classical questions. The point is basically that classical concepts are in some sense approximations to what’s really going on. Of course, our intuition is classical, so it should not be surprising that when we try to apply it to quantum phenomena we find some strangeness. To take but one example, the trajectory of a classical particle makes perfect sense. Hence in the double-slit experiment one might think that it’s valid to ask the question “which slit did the photon go through?” This only makes sense because we are assuming that the photon follows some classical trajectory. In quantum mechanics this concept does not exist, so it is not surprising that one cannot give a precise answer to that question. The best compromise is a probabilistic answer. The same applies to the question “when will a certain particle decay?”, which has killed (or not!) so many imaginary cats in thought experiments.

This story can be made mathematically very precise. In his book on the mathematical foundations of quantum mechanics, George Mackey has what to my mind is a beautiful treatment of the structure of classical and quantum mechanics and of their statistical counterparts from the perspective of the measurements one can make. (The book dates from the 1960s and I read it as a graduate student in the 80s. I have not worked on this topic, so I am not aware of what must certainly be a sizeable modern literature on the topic.)

proaonuiq

José, a clarifying answer!

This comment is just to remind us again that both classical and quantum mechanics (good models, but possibly not the definitive ones) are models of reality, not reality itself.

For me, not having thought about it in depth either, so speaking intuitively: what we observers call a random phenomenon in reality is just a phenomenon that can be described with a probabilistic model. For instance, a phenomenon in an experiment (such as the double-slit experiment you quote) for which, in response to “exactly” the same conditions, different outputs are obtained, but in such a way that they can be captured by a probabilistic model (in a wide sense).

So, as you point out, when an observer says “this phenomenon is random because I can describe it with a probabilistic model”, he is making an epistemological assertion, not an ontological one. It is not impossible that, as knowledge advances, this epistemologically random phenomenon will later be explained in a more deterministic manner (that is, same conditions, always the same output).

José: I can’t really address the issue in just one blog post; I would want at least an hour lecture if not an extended seminar or a long e-mail discussion. But as a student of quantum probability, which is the same as non-commutative probability, I think that the deterministic interpretation of vector states in quantum mechanics is untenable.

A hopelessly short version: A mixed state is the natural quantum generalization of a probability distribution, and a pure state is a mixed state that happens to be extremal. If the algebra of random variables is all bounded operators on a Hilbert space, then the vector states that you emphasize correspond to pure states; but if it is only some of the operators, then actually a vector state can represent any mixed state.
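The pure/mixed distinction can be made concrete with density matrices. A small NumPy sketch (the particular states are arbitrary illustrations): the purity tr(ρ²) equals 1 exactly for pure states and is strictly smaller for proper mixtures.

```python
import numpy as np

psi = np.array([1.0, 0.0], dtype=complex)  # basis state |0>
phi = np.array([0.0, 1.0], dtype=complex)  # basis state |1>

# A pure state is a rank-one projection |psi><psi| ...
rho_pure = np.outer(psi, psi.conj())

# ... while a mixed state is a proper convex combination of pure states.
rho_mixed = 0.5 * np.outer(psi, psi.conj()) + 0.5 * np.outer(phi, phi.conj())

def purity(rho):
    """tr(rho^2): equals 1 iff rho is pure."""
    return np.trace(rho @ rho).real

print(purity(rho_pure))   # 1.0
print(purity(rho_mixed))  # 0.5
```

In Greg's terminology, rho_pure is an extreme point of the convex set of states, and rho_mixed sits strictly inside it; the vector-state formalism by itself only ever exhibits the extreme points of the full algebra of bounded operators.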

Liza

Thank you, Gil, for promoting this discussion; I have learned a lot.
It is a great pleasure to see people from academia taking part in such public discussions.

My tentative take on the matter is this: whenever we try to describe or to control (or to grasp) a physical process, randomness reflects effects external to our model. Those can reflect interaction of the process with the environment, and also small-scale ingredients which affect the process but are neglected by the model. Before Liza’s question, I thought that, at the very least, randomness is a “genuine” phenomenon in quantum physics. (And also that this is manifested by the uncertainty principle.) However, Jose’s answer made this matter less clear for me.

Thank you Liza for initiating this. I’d certainly be happy to learn more.

Yes, Greg’s last comment was eaten for a few days. I am not sure that I understand why Greg regards the deterministic point of view of QM as untenable, or whether his view applies to randomness in classical mechanics as well.

Moving a bit from “randomness” to something else perhaps: in biology, what looks confusing or improbable (…nonfunctional… etcetera…) and even sometimes random at one level of analysis sometimes resolves into clarity at a so-called lower level. “Problems” at the species or organism levels sometimes achieve much better clarity at the gene level. The key search phrase these days appears to be selfish genetic elements. The “co-replicon” of the Tooby and Cosmides 1981 Journal of Theoretical Biology paper on intragenomic conflict (a set of genes that tend to pass into future generations through similar means, i.e. that “co-replicate”) is perhaps the most useful unit of analysis… in my (not-falsely-humble) opinion. Also, there are powerful principles like frequency-dependent selection that result in diversification away from the patterns most likely to be targeted by predators and pathogens. Thus there is a constant dynamic interplay between universals and differences, as in the case, for example, of evolutionary psychology: the founders strongly emphasized the search for human universals, but advances in behavior genetics and other areas keep stretching evolutionary psychology more and more into the sub-fields of psychology sometimes lumped together as differential psychology. (Yes, it’s ridiculously complicated, and that’s why it’s so much fun!)

In terms of the randomness of mutations, there is the concept of the mutational meltdown. Ridley wrote a treatment of this for non-technical audiences (back in 2002 or some other ancient time), but the concept strongly influences various areas of biology such as fitness-indicator theory, aging, and so on. (Long url, so search Google Books for mutational meltdown Ridley.) Also, lately there’s been a lot of … yak-yak about the fact that some parts of DNA sequences are more prone to mutations than others. People keep reading all sorts of things into that, as people are wont to do. It’s old-old news. Recently, methionine related sequences have been emphasized in the aging-processes literature for example. But possibly that’s just associated with scrambling the targets of pathogens as part of basic immune system strategies. (After all, multicellularity took an extremely long time to evolve, and becoming big juicy targets for smaller organisms likely had a lot to do with that.)

If we examine a macroscopic measurement … say of cantilever position using an optical interferometer … we find that a macroscopic measurement is made up of a very large number of microscopic measurements … say 10^15 clicks, distributed between a pair of photodiodes.

There are so many ways that those 10^15 clicks can be distributed between two photodiodes that the Kolmogorov/Chaitin counting argument guarantees that almost every such experimental record is algorithmically incompressible, that is, random.
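This counting argument can be demonstrated at desktop scale. A sketch in Python, with os.urandom standing in for the stream of clicks and zlib as a crude practical proxy for algorithmic compressibility (Kolmogorov complexity itself is uncomputable, so any real compressor only gives an upper bound):

```python
import os
import zlib

n = 10**5
random_record = os.urandom(n)      # stand-in for a record of thermal/quantum clicks
ordered_record = b"01" * (n // 2)  # a highly ordered record of the same length

r = len(zlib.compress(random_record, 9))
o = len(zlib.compress(ordered_record, 9))

print(r / n)  # near 1: a typical record is essentially incompressible
print(o / n)  # tiny: an ordered record compresses enormously
```

The asymmetry is exactly the counting argument: there are vastly more incompressible strings of a given length than compressible ones, so a "typical" record of clicks has no short description.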

So the real mystery is not, “Why is nature/quantum mechanics random”— Kolmogorov/Chaitin explains *that* quite rigorously—but rather “Why does nature/quantum mechanics show any order at all?”

If we take “order” to mean “algorithmic compressibility”, then the answer from orthodox QM is simply “Devices like eyes, brains, and photodiodes are designed—by evolution or by human ingenuity—to dynamically generate algorithmically compressible outputs.”

In other words, Nature becomes ordered by the dynamical process of our looking at it with well-designed eyes, and thinking about it with well-designed brains.

This explanation is reasonably satisfactory for engineers and evolutionary biologists, at any rate … philosophers may be more skeptical! :)

Dear John, the question “Why does nature/quantum mechanics show any order at all?” is, of course, very interesting. (The related question “why there is something rather than nothing” is of great interest in philosophy.)

After skimming the abstracts of these Van Leer conference talks, I must conclude that the more I read about QM, the more I think about the double-edged sword of some formal systems: used as models of reality, they can illuminate as much as they can dazzle. If with such a great but rough probabilistic approximation we can do so much, what couldn’t we do with the (deterministic?) definitive theory? QM is dazzling.

P.S.

I found Wilce’s talk especially interesting (judging from its abstract). A quote from another paper by the same author:

“At its core (i.e. as a purely mathematical theory), quantum mechanics can be regarded as a non-classical probability calculus resting upon a non-classical propositional logic. More specifically, in quantum mechanics each probability-bearing proposition of the form “the value of physical quantity A lies in the range B” is represented by a projection operator on a Hilbert space H. These form a non-Boolean — in particular, non-distributive — orthocomplemented lattice. Quantum-mechanical states correspond exactly to probability measures (suitably defined) on this lattice”.
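The non-distributivity Wilce mentions can be verified in the smallest possible example: three distinct lines in C^2. A NumPy sketch computing subspace dimensions via matrix ranks (the join of two subspaces is the span of their union; the meet is their intersection):

```python
import numpy as np

P = np.array([[1.0], [0.0]])               # the line spanned by e1
Q = np.array([[0.0], [1.0]])               # the line spanned by e2
R = np.array([[1.0], [1.0]]) / np.sqrt(2)  # the diagonal line

def dim(A):
    """Dimension of the subspace spanned by the columns of A."""
    return int(np.linalg.matrix_rank(A))

def join(A, B):
    """A spanning matrix for the join (span of the union) of two subspaces."""
    return np.hstack([A, B])

def meet_dim(A, B):
    """dim of the meet, via dim(A ∩ B) = dim A + dim B - dim(A ∨ B)."""
    return dim(A) + dim(B) - dim(join(A, B))

# Left side of distributivity: Q ∨ R is all of C^2, so P ∧ (Q ∨ R) = P.
lhs = meet_dim(P, join(Q, R))
# Right side: P ∧ Q and P ∧ R are both {0}, so their join is {0} too.
rhs = meet_dim(P, Q) + meet_dim(P, R)

print(lhs, rhs)  # 1 0: the lattice of quantum propositions is not distributive
```

So P ∧ (Q ∨ R) is one-dimensional while (P ∧ Q) ∨ (P ∧ R) is zero-dimensional; no Boolean algebra behaves this way, which is precisely why quantum states are probability measures on a non-Boolean lattice.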

Thanks again for the discussion.
And to another topic:
I started reading “Gina Says” yesterday – it’s HILARIOUS. I’m enjoying every minute.
Having joined the popular/scientific blogging scene recently I always get a kick out of how religious science discussions tend to be. (And on a personal level – I sympathize with Gina so much – GIRL POWER! )

The existence of actual “randomness” in nature has no logical foundation. We can only observe “apparently unpredictable behaviour”. The same is true of quantum-level behaviour, where we saw some people with big egos berate Einstein and come up with outlandish and wholly illogical explanations for behaviour they had no idea why was happening, because they just couldn’t bring themselves to admit it. The “uncertainty principle” is a case in point: it is a blindingly obvious statement about unpredictability in the observations (the “momentum” it refers to is about the CURRENT TRAJECTORY, nothing to do with the past, as Heisenberg himself admitted). Every other purported “explanation” of the behaviour reduces simply to “I HAVEN’T GOT A CLUE WHY IT HAPPENS”. “Apparently unpredictable” observations do not prove the existence of randomness in nature, and never can.

“An electron is not a billiard ball, and it’s not a crest and trough moving through a pool of water. An electron is a mathematically different sort of entity, all the time and under all circumstances, and it has to be accepted on its own terms.

The universe is not wavering between using particles and waves, unable to make up its mind. It’s only human intuitions about QM that swap back and forth. The intuitions we have for billiard balls, and the intuitions we have for crests and troughs in a pool of water, both look sort of like they’re applicable to electrons, at different times and under different circumstances. But the truth is that both intuitions simply aren’t applicable.”

The presumed randomness in the particle world (as opposed to the appearance of randomness due to limitations in our knowledge about the players in a quantum event) only appears so because we incorporate time when we consider causal relationships, and we consider causal relationships when we dissect nature into smaller and smaller segments (a reductionist approach). If we entertain the perspective that the cosmos is a holistic entity inclusive of all time that is tantamount to “now,” then events that appear random, such as, for example, a particle moving here instead of there in a given instant or a radioactive particle emitted at a certain precise moment instead of another moment, are actually not random but are, instead, merely following the “blueprint” reflective of the nature of an unseen, incomprehensible, timeless whole.