A Revised Attack on Computational Ontology

Abstract

There has been an ongoing conflict regarding whether reality is fundamentally digital or analogue. Recently, Floridi has argued that this dichotomy is misapplied: any attempt to analyse noumenal reality independently of the level of abstraction at which the analysis is conducted is mistaken. In the pars destruens of this paper, we argue that Floridi does not establish that it is only levels of abstraction that are analogue or digital, rather than noumenal reality itself. In the pars construens of this paper, we reject the classification of noumenal reality as a deterministic discrete computational system. Drawing on considerations from classical physics, we show why a deterministic computational view of the universe faces problems (e.g., a reversible computational universe cannot be strictly deterministic).

Appendix

In order to make our analysis above more self-contained and accessible to readers of different backgrounds, we provide below a brief summary of some of Popper’s arguments that are relevant to our discussion. Popper argued that most physical systems are indeterministic (Popper 1950a, b). He equated (physical) indeterminism with the doctrine that “not all events are ‘determined’ in every detail” (1950a, p. 120). Determinism, conversely, was taken to be the doctrine that all events are determined, without exception, whether future, present or past. By “determined events” he meant events that are “predictable in accordance with the methods of science” (ibid). The unpredictability of the events under consideration is such that it cannot be “mitigated by the predictability of their frequencies” (ibid, p. 117, italics added). This account of determinism makes it scientifically refutable.

Moreover, the impossibility of predicting such events is, according to Popper, a physical rather than a logical one. An unpredictable observable event may still be correctly described fortuitously; hence predicting such an event is not logically impossible, but it is physically impossible by the rational methods of prediction in physics. These methods include the acquisition of initial information by observation (ibid, pp. 117–118). Popper thereby showed that, in an important sense, all scientific predictions are deficient even from the perspective of classical physics.

Furthermore, Popper suggested that the deterministic character of classical Newtonian mechanics is illustrated by the story of the Laplacean demon, a superhuman omniscient entity (ibid, p. 122). If all natural laws took the form of equations that uniquely determine the future from the present state, then, given perfect knowledge of the initial state of the world (the initial information), the demon could predict every future state of the world by mathematical deduction alone. This kind of predictability is arguably deterministic, for it implies that, given foreknowledge of any future state (based on the initial information), all future states must be determined now: past, present and future states are all necessarily connected.
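The Laplacean picture can be sketched computationally. The following toy model is our own illustration, not Popper’s formalism: the update rule `step` is a hypothetical one-particle dynamics, and the point is only that a deterministic rule plus perfect initial information fixes every future state by deduction alone.

```python
def step(state):
    """One tick of a toy deterministic dynamics: position advances by velocity."""
    position, velocity = state
    return (position + velocity, velocity)

def predict(initial_state, n_ticks):
    """The 'demon': derive the state after n_ticks purely from the initial data."""
    state = initial_state
    for _ in range(n_ticks):
        state = step(state)
    return state

# With perfect knowledge of the initial state, every future state is fixed now:
assert predict((0, 2), 5) == (10, 2)
```

The force of Popper’s subsequent arguments is that no physical machine can occupy the demon’s position in this picture, even granting the deterministic rule.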

To ground this nonphysical demon in the physical realm, Popper proposed to replace it with a calculating predicting machine, Predictor (ibid, p. 118). Predictor (which is in effect a computer) is designed according to the laws of classical physics so as to produce permanent records of some type (say, a write-once TM tape) that can be interpreted as predictions of the positions, velocities, and masses of physical particles. Popper argued that Predictor could never fully predict every one of its own future states, nor those of the part of the world with which it interacts. He showed that either no such Predictor could exist in the physical world or its future states could not be predicted by any existing Predictor (ibid, p. 119).

The crux of Popper’s Predictor argument is that, just as indeterminism in quantum physics is related to the measurement problem, in classical physics the interaction of a Predictor with the system it measures (possibly itself) results in a similar indeterminism. If Predictor B measures another Predictor A, then B amplifies the signals from A. When a third Predictor C measures the system A + B, C must likewise interact with, and amplify the signals from, A + B. It is further assumed that B must also measure C; this, Popper argued, leads to the breakdown of the “one-way membrane” between B and C, and with it the conditions for successful prediction. None of these Predictors can have knowledge of its own state before that state has passed. Each Predictor can obtain information about its own state only by studying the results obtained by another Predictor or by being given those results (ibid, pp. 129–130). Popper also showed that there could not be an infinite series of Predictors such that the nth Predictor is superior to its predecessors. Only on the assumption that for every Predictor P there exists some P+ that is not only superior to P but also undetectable by P does the finite determinist doctrine hold (ibid, pp. 131–133).

Another version of the Predictor argument was based on a variation of Russell’s Tristram Shandy paradox. Tristram Shandy attempts to narrate his full autobiography, but he spends more time describing the details of each event than it took him to live through it. His autobiography, accordingly, rather than becoming “up to date” with the present, falls further and further behind. Even if Tristram Shandy narrates the full description of his history arbitrarily fast, he must be incapable of bringing it completely up to date (Popper 1950b, p. 174).
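The arithmetic behind the paradox can be made explicit. The sketch below is our illustration (the ratio k is a hypothetical parameter, not Popper’s): if narrating one day of life takes k > 1 days of writing, the backlog of unnarrated days grows without bound, so the account never catches up, however long Shandy works.

```python
def backlog(days_lived, days_per_day_narrated):
    """Days still unnarrated after `days_lived` days of continuous narration,
    where each lived day takes `days_per_day_narrated` days to write up."""
    days_narrated = days_lived / days_per_day_narrated
    return days_lived - days_narrated

# With k = 2, after 100 days lived only 50 are narrated; the lag keeps growing:
assert backlog(100, 2) == 50.0
assert backlog(1000, 2) > backlog(100, 2)
```

Only if k were at most 1 (narration at least as fast as life) would the backlog stay bounded; Popper’s point is that physical limits on speed rule this out for a self-describing machine.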

Popper proposed another Predictor (call it TP) that is endowed with a memory in which the results of its calculated predictions and the initial information it receives are stored. TP receives accurate and complete information about its state at time T0 and is tasked with predicting some future state at time Tn. Every physical machine has a maximum running speed, and consequently a minimum length of time is needed to complete even the shortest description of which that machine is capable. It therefore cannot simply be assumed that the series of time intervals between T0 and Tn, during which TP attempts to perform its prediction task, converges. TP must retain in memory not only the records of its final predictions, but also the intermediate partial results of its calculations. Any description of TP’s memory will take at least as much memory space as the memory state it describes. Since the memory space that can be used before Tn has elapsed is finite, the description of TP’s memory cannot be completed before Tn, regardless of TP’s speed (Popper 1950b, pp. 175–177).
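The memory bound admits a simple counting sketch. This is our illustration, with hypothetical numbers: grant the lower bound that a faithful self-description needs at least one cell per cell of memory described; then as soon as any workspace at all is needed for intermediate results, the description plus workspace exceeds the memory available.

```python
MEMORY_SIZE = 1000      # hypothetical fixed capacity of TP's memory, in cells
WORKSPACE_CELLS = 1     # at least some cells are needed for partial results

def self_description_fits(memory_size, workspace):
    """Can a full description of the memory coexist with the workspace
    required to compute it? Lower bound: one description cell per memory cell."""
    description_size = memory_size
    return description_size + workspace <= memory_size

# With zero workspace the description exactly fills memory; with any workspace
# at all, it can no longer fit:
assert self_description_fits(MEMORY_SIZE, 0)
assert not self_description_fits(MEMORY_SIZE, WORKSPACE_CELLS)
```

This only models the counting step of the argument; Popper’s full version adds the timing constraint that the description must also be finished before Tn.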

The last argument presented here is Popper’s Oedipus Effect, according to which a prediction can influence the predicted event (ibid, pp. 188–190). The point is that the receipt by a Predictor C of complete information about its own immediate past will ultimately change C’s future state, since C is designed to act upon the informative signals it receives. This self-information qualifies as a strong interference with the working of C. Still, some very superior C+ may foresee the future state change caused by C’s receiving the information, and give C inaccurate information about C’s state, ingeniously designed to induce C to make correct predictions about itself. However, no finite piece of information can be precise self-information, for the finite self-information must contain a description of itself, and this is impossible: there cannot be a bijection from a finite data set S to a smaller, proper subset of S.
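The closing step is just the pigeonhole principle, and for small finite sets it can be checked exhaustively. The brute-force search below is our illustration: it enumerates every map from a domain of size d into a codomain of size c and looks for an injective one, confirming that none exists when c < d.

```python
from itertools import product

def injection_exists(domain_size, codomain_size):
    """Search all maps {0..d-1} -> {0..c-1} for an injective (one-to-one) one."""
    codomain = range(codomain_size)
    for mapping in product(codomain, repeat=domain_size):
        if len(set(mapping)) == domain_size:  # all images distinct
            return True
    return False

assert injection_exists(3, 3)       # a bijection onto a same-size set exists
assert not injection_exists(3, 2)   # pigeonhole: none into a smaller subset
```

Since a complete self-description would have to map a finite body of information one-to-one into a strictly smaller part of itself, no such description can exist.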