Thursday, February 19, 2015

If you're a long-time Formula 1 fan, the chances are that you believe the sport was better in the past. Equally, however, you will probably have read arguments from younger journalists and fans to the effect that Formula 1 in the modern era is better than it was in the past.

Fortunately, there is an objective means to resolve this dispute: churn.

In sport, churn provides a straightforward measure of the uncertainty of outcome. Churn is simply the average difference between the relative rankings of the competitors at two different measurement points. One can measure the churn at an individual race by comparing finishing positions to grid positions; one can measure the churn from one race to another within a season by comparing the finishing positions in each race; and one can measure the inter-seasonal churn by comparing the championship positions from one year to another.

The latter measure provides an objective means of tracking the level of seasonal uncertainty in Formula 1, and F1 Data Junkie Tony Hirst has recently compiled precisely these statistics, for both the drivers' championship and the constructors' championship (see figures below). In each case, Hirst compiled the churn and the 'adjusted churn'. The latter is the better measure because it normalises the statistic by the maximum possible value of the churn in each year; that maximum can change as the number of competitors changes.
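Hirst's own code isn't reproduced in this post, but the idea can be sketched in a few lines of Python. This is a minimal sketch under my own assumptions: churn is taken to be the mean absolute difference between the two rankings, and the adjustment normalises by the churn of a complete reversal of the order (the function names are mine, not Hirst's).

```python
def churn(ranks_a, ranks_b):
    """Mean absolute difference between two rankings of the same competitors.

    Each argument lists the rank (1 = first) of competitor i at one
    measurement point, e.g. grid position vs. finishing position, or
    championship position in year N vs. year N+1.
    """
    assert len(ranks_a) == len(ranks_b)
    return sum(abs(a - b) for a, b in zip(ranks_a, ranks_b)) / len(ranks_a)

def adjusted_churn(ranks_a, ranks_b):
    """Churn normalised by its maximum possible value for this field size.

    The maximum occurs when the order is completely reversed, so the
    adjusted statistic is comparable across seasons with different
    numbers of competitors.
    """
    n = len(ranks_a)
    max_churn = churn(list(range(1, n + 1)), list(range(n, 0, -1)))
    return churn(ranks_a, ranks_b) / max_churn

# Example: a four-car field in which each pair of cars swaps places.
# churn([1, 2, 3, 4], [2, 1, 4, 3])          -> 1.0
# adjusted_churn([1, 2, 3, 4], [2, 1, 4, 3]) -> 0.5  (the maximum here is 2.0)
```

Note how the normalisation matters: a raw churn of 1.0 means something quite different in a four-car field (half the possible shuffling) than in a twenty-car field (a small fraction of it).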

The results for the drivers' championship indicate that churn peaked in 1980. Given that the interest of many, if not most, spectators is dominated by the outcome of the drivers' championship, this suggests that Formula 1 peaked circa 1980.

The results for the constructors' championship are slightly different, suggesting that uncertainty peaked in the late 1960s (although the best-fit line peaks in the mid-1970s).

One could, of course, make the alternative proposal that the churn within individual races is more important to spectators' interest, but at the very least we now have an objective statistical measure which provides good reason for believing that Formula 1 was better in the 1970s and early 1980s.

Monday, February 16, 2015

In James Lovelock's 2006 work, The Revenge of Gaia, he concludes the chapter entitled What is Gaia? with a description of the regulator in James Watt's steam engine, and the following argument:

"Simple working regulators, the physiological systems in our bodies that regulate our temperature, blood pressure and chemical composition...are all outside the sharply-defined boundary of Cartesian cause-and-effect thinking. Whenever an engineer like Watt 'closes the loop' linking the parts of his regulator and sets the engine running, there is no linear way to explain its working. The logic becomes circular; more importantly, the whole thing has become more than the sum of its parts. From the collection of elements now in operation, a new property, self-regulation, emerges - a property shared by all living things, mechanisms like thermostats, automatic pilots, and the Earth itself.

"The philosopher Mary Midgley in her pellucid writing reminds us that the twentieth century was the time when Cartesian science triumphed...Life, the universe, consciousness, and even simpler things like riding a bicycle, are inexplicable in words. We are only just beginning to tackle these emergent phenomena, and in Gaia they are as difficult as the near magic of the quantum physics of entanglement."

Now Lovelock is an elegant and fascinating author, but here his thought is lazy, sloganistic and poorly-informed. There are multiple confusions here, and such confusions are endemic amongst a number of writers and journalists who take an interest in science, so let's try and clear them up.

Firstly, we encounter the slogan that a system can be 'more than the sum of its parts'. Unfortunately, the authors who make this statement never seem to conjoin the assertion with a definition of what they mean by the phrase 'sum of its parts'. Most scientists would say that the sum of the parts of a system comprises the parts of the system, their properties, and all the relationships and interactions between the parts. If you think that there is more to a whole system than its parts, their properties and the relationships between the parts, then that amounts to a modern form of vitalism and/or dualism, the notion that living things and/or conscious things depend upon non-physical elements. Calling it 'emergentism' is simply a way of trying to dress up a disreputable idea in different language, rather in the manner in which creationism was re-marketed as 'intelligent design'.

Assertions that a system can be more than the sum of its parts are frequently combined with attacks on so-called 'reductionistic' science. Anti-reductionistic authors can often be found pointing out that whole systems possess properties which are not possessed by any of the parts of which that system is composed. However, if such authors think this is somehow anti-reductionistic, then they have profoundly misunderstood what reductionistic science does. Scientists understand that whole systems possess properties which are not possessed by any of the parts; that's precisely because the parts engage in various relationships and interactions. A primary objective of reductionistic science is to try and understand the properties of a whole system in terms of its parts, and the relationships between the parts: diamond and graphite, for example, are both composed of the same parts (carbon atoms), but what gives diamond and graphite their different properties are the different arrangements of the carbon atoms. Explaining the different properties of diamond and graphite in terms of the different relationships between the parts of which they are composed is a triumph of so-called 'reductionistic' science.

The next confusion we find in Lovelock's argument is the notion that twentieth-century science was somehow linear, or Cartesian, and that non-linear systems with feedback somehow lie outside the domain of this world-view. Given the huge body of twentieth-century science devoted to non-linear systems, this will come as something of a surprise to many scientists. For example, in General Relativity (that exemplar of twentieth-century science), the field equations are non-linear. Lovelock might even have heard the phrase 'matter tells space how to curve, and space tells matter how to move'; a feedback cycle, in other words! Yet General Relativity is also a prime exemplar of determinism: the state of the universe at one moment in time uniquely determines its state at all other moments in time. There is clearly no reason to accept the implication that cause-and-effect must be confined to linear chains; non-linear systems with feedback are causal systems just as much as linear ones.

It's amusing to note that Lovelock concludes his attack on so-called 'Cartesian' science with an allusion to quantum entanglement. Clearly, quantum entanglement is a product of quantum physics, that other exemplar of twentieth-century science. So, in one and the same breath, twentieth-century science is accused of being incapable of dealing with emergentism, yet also somehow credited with yielding the primary example of it.

Authors such as Lovelock, Midgley, and their journalistic brethren are guilty here of insufficient curiosity and insufficient understanding. The arguments they raise against twentieth-century science merely indicate that they have failed to fully understand twentieth-century science and physics.