On Evolution

A process atheist is someone who holds that every question once answered by appealing to God can be better answered by appealing to some form of evolution.

So you might wonder about the meaning of the term evolution.

Since the term evolution is abstract, its definition will be abstract: a process is evolutionary if and only if it increases complexity. Generally, this means that the complexity of the most complex things is increasing; less complex things may still persist. Evolutionary processes thus build stratified layers of complexity – they build complexity hierarchies. Of course, the weight now falls on the term complexity. Fortunately, there are explicit ways to cash that out. Different types of evolution will use different complexity metrics (and that, indeed, is exactly what makes them different types of evolution).

Within biological evolution, the arrow of complexity hypothesis states that “the complex functional organization of the most complex products of open-ended evolutionary systems has a general tendency to increase with time” (Bedau, 1998: 145). And biological evolution does support various arrows of complexity. You might say this is Kantian purposiveness without purpose; but it would be distracting to get into that. On to the metrics:

Bonner says that the complexity of an organism is the number of distinct cell types it contains (1988: 101). He argues that evolution tends to increase the complexity of the most complex (species of) organisms. Adami et al. (2000) equate the complexity of organisms with the complexity of their genomes; they define the complexity of a genome to be the amount of information it encodes about the environment in which it has evolved. On their measure, this genomic complexity tends to increase steadily over evolutionary time.
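Adami et al.’s measure can be made concrete in a toy way. Here is a minimal sketch (my own encoding, not their actual method, which works with populations of digital organisms): complexity is genome length minus the per-site Shannon entropy summed across an aligned population, so sites fixed by selection count as stored information about the environment, while freely varying sites count for nothing.

```python
import math

def per_site_entropy(column, alphabet="ACGT"):
    """Shannon entropy of one genome site across the population,
    in units of log base |alphabet| (so the maximum is 1.0)."""
    n = len(column)
    ent = 0.0
    for symbol in set(column):
        p = column.count(symbol) / n
        ent -= p * math.log(p, len(alphabet))
    return ent

def genomic_complexity(population):
    """Adami-style complexity: genome length minus summed per-site
    entropy. Sites fixed by selection (low entropy) carry information
    about the environment; freely varying sites carry none."""
    length = len(population[0])
    columns = ["".join(g[i] for g in population) for i in range(length)]
    return length - sum(per_site_entropy(c) for c in columns)

# A population whose first three sites are fixed by selection and whose
# last site varies freely carries about 3 units of information.
pop = ["ACGA", "ACGC", "ACGG", "ACGT"]
print(round(genomic_complexity(pop), 2))  # → 3.0
```

On this toy measure, tightening selection (fixing more sites) raises complexity, which is the qualitative behavior Adami et al. report.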

Within chemistry, one might simply define the complexity of an element to be its number of protons. For molecules, more structural definitions can be used. Over time, ever more complex elements have progressively appeared in our universe; thus the complexity of the most complex elements has been increasing.

At the most general level, Chaisson says that the complexity of a system is “the rate at which free energy transits a complex system of given mass”; it is “the free energy rate density, alternatively called the specific free energy rate, expressed in units of energy per time per mass” (2001: 134). Chaisson shows – with impressive clarity – how the complexity of the most complex things has been steadily increasing.
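Chaisson’s metric is straightforward to compute once you know a system’s power output and mass. A rough sketch, with order-of-magnitude figures I’ve plugged in for illustration (they are textbook approximations, not Chaisson’s tabulated values):

```python
def free_energy_rate_density(power_erg_per_s, mass_g):
    """Chaisson's complexity metric: energy per time per mass (erg/s/g)."""
    return power_erg_per_s / mass_g

# Rough illustrative figures (order-of-magnitude only).
sun = free_energy_rate_density(3.8e33, 2.0e33)    # solar luminosity / solar mass
brain = free_energy_rate_density(2.0e8, 1.4e3)    # ~20 W over ~1.4 kg

print(f"Sun:   {sun:.1f} erg/s/g")
print(f"Brain: {brain:.0f} erg/s/g")
```

On these rough figures a living brain scores around five orders of magnitude higher than a star, which is exactly the kind of stratified increase Chaisson charts.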

Another way to look at physical complexity is to use Dennett’s levels (1991). He distinguishes between the physical, design, and intentional levels. The history of our universe started with just the physical level; design levels emerged (chemical and biological); and then intentional levels emerged (psychological, social). Dennett has also applied his levels to other types of universes, like cellular automata. And, close to Dennett’s ideas, I’ll give a shout out to Jagers op Akkerhuis’s operator hierarchy (2008). (Though I admit I find Jagers op Akkerhuis very hard to understand.)

One very general measure of complexity (and probably the best) is Bennett’s logical depth (1988). The logical depth of a structure is the amount of computational work required to generate it; this can be measured formally in terms of the run times of the shortest programs that generate the structure. For cosmological evolution, something like logical depth is a good measure. The process atheist says that cosmological evolution is increasing the logical depth of universes.
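The intuition behind logical depth can be sketched with a toy proxy (my own illustration, not Bennett’s formal definition): a random string is incompressible but shallow, since its shortest program just prints it in roughly one step per character, while the string of the first thousand primes has a short generator that must grind through a great deal of work.

```python
import random

def trial_division_primes(n):
    """Generate the first n primes, counting divisibility tests as 'work'."""
    primes, work, candidate = [], 0, 2
    while len(primes) < n:
        is_prime = True
        for p in primes:
            work += 1
            if p * p > candidate:
                break
            if candidate % p == 0:
                is_prime = False
                break
        if is_prime:
            primes.append(candidate)
        candidate += 1
    return primes, work

# Short program, long run time: high depth on Bennett's account.
primes, depth_proxy = trial_division_primes(1000)

# A random string has no generator shorter than itself; its shortest
# program just prints it, so it is shallow despite being incompressible.
random_string = "".join(random.choice("01") for _ in range(5000))
shallow_proxy = len(random_string)

print(depth_proxy > shallow_proxy)  # → True
```

The point of the proxy: depth tracks buried computational history, not mere randomness, which is why it suits cosmological evolution better than raw information content.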

It’s interesting to note that logical depth maps very closely onto Leibniz’s notion of perfection. (And Leibniz, remarkably, even offered his analysis of perfection in terms of binary strings! I love Leibniz!) Leibniz offers a quantitative analysis of perfection: he says perfection is quantity of essence (1697: 86). Leibniz often says that perfection has two dimensions: it is a product of variety and order (Monadology, sec. 58; Theodicy, sec. 207; Discourse on Metaphysics sec. 6). Order is like algorithmic regularity and variety is like algorithmic randomness. Hence Leibniz’s concept of perfection is like logical depth.
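Leibniz’s two dimensions can be given a hedged computational gloss. In this toy sketch (my own proxies, not anything in Leibniz or Bennett), byte-level Shannon entropy stands in for variety, zlib compressibility stands in for order, and perfection is scored as their product.

```python
import math
import zlib

def variety(data: bytes) -> float:
    """Shannon entropy per byte, normalized to [0, 1]: a crude proxy
    for algorithmic randomness (Leibniz's 'variety')."""
    if not data:
        return 0.0
    ent = 0.0
    for b in set(data):
        p = data.count(b) / len(data)
        ent -= p * math.log2(p)
    return ent / 8.0

def order(data: bytes) -> float:
    """Compressibility: a crude proxy for algorithmic regularity
    (Leibniz's 'order')."""
    if not data:
        return 0.0
    return max(0.0, 1.0 - len(zlib.compress(data)) / len(data))

def perfection(data: bytes) -> float:
    """Leibniz-style perfection as the product of variety and order."""
    return variety(data) * order(data)

monotone = b"a" * 4000              # pure order, no variety
patterned = bytes(range(256)) * 16  # varied *and* highly regular
print(perfection(patterned) > perfection(monotone))  # → True
```

A monotone string scores zero (no variety), an incompressible random string scores near zero (no order), and only strings that are both varied and regular score high – which is the shared intuition behind Leibniz’s perfection and Bennett’s depth.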

Once we get out into the infinite, more powerful measures are needed. Kyburg (1961: 392-393) says that the complexity of a theory is measured by the number of quantifiers in the shortest version of the theory. Another, and probably better, approach is to use something like the Kleene-Mostowski (arithmetical) hierarchy. Given any axiom system (any theory) expressed in the predicate calculus in prenex normal form, the complexity of the theory is the number of alternating blocks of quantifiers in its prefix, where each block consists of quantifiers of a single type. The complexity of a universe is then the complexity of the simplest theory of which the universe is a model. The process atheist says that this (or some similar) metric of complexity is steadily increasing as structures are produced one after another by metaphysical evolution.
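The Kleene-Mostowski count is easy to make concrete. A minimal sketch (my own encoding, writing 'A' for a universal quantifier and 'E' for an existential one in a prenex prefix):

```python
def alternation_depth(prefix: str) -> int:
    """Count alternating blocks in a prenex quantifier prefix, written
    as a string of 'A' (universal) and 'E' (existential) symbols.
    'AAEAE' has four blocks: AA | E | A | E."""
    blocks = 0
    previous = None
    for q in prefix:
        if q not in "AE":
            raise ValueError(f"unexpected symbol: {q!r}")
        if q != previous:
            blocks += 1
            previous = q
    return blocks

print(alternation_depth("AAEAE"))  # → 4
```

On this count, a quantifier-free theory has complexity 0, a purely universal one has complexity 1, and each genuine alternation climbs one level of the hierarchy.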