A New Kind of Science

A New Kind of Science is a best-selling[1] and controversial book by Stephen Wolfram, published by his own company in 2002. It contains an empirical and systematic study of computational systems such as cellular automata. Wolfram calls these systems simple programs and argues that the scientific philosophy and methods appropriate for their study are relevant to other fields of science.

Computation and its implications

The thesis of A New Kind of Science (NKS) is twofold: that the nature of computation must be explored experimentally, and that the results of these experiments have great relevance to understanding the physical world. Since its beginnings in the 1930s, computation has been approached primarily from two traditions: engineering, which seeks to build practical systems using computation, and mathematics, which seeks to prove theorems about computation. However, as recently as the 1970s, computing was described as standing at the crossroads of mathematical, engineering, and empirical traditions. [2] [3]

Wolfram introduces a third tradition, one that seeks to empirically investigate computation for its own sake. He argues that an entirely new method is needed to do so, because traditional mathematics fails to meaningfully describe complex systems, and that there is an upper limit to complexity in all systems. [4]

Simple programs

The basic subject of Wolfram’s “new kind of science” is the study of simple abstract rules, essentially elementary computer programs. In almost any class of computational system, one quickly finds instances of great complexity among its simplest cases, produced by nothing more than repeated application of the same simple set of rules. This seems to be true regardless of the components of the system and the details of its setup. Systems explored in the book include, among others, cellular automata in one, two, and three dimensions; mobile automata; Turing machines in 1 and 2 dimensions; several varieties of substitution and network systems; primitive recursive functions; nested recursive functions; combinators; tag systems; register machines; and reversal-addition. For a program to qualify as simple, there are several requirements:

Its operation can be completely explained by a simple graphical illustration.

It can be completely explained in a few sentences of human language .

It can be implemented in a computer language using just a few lines of code.

The number of possible variations is small enough that all of them can be computed.

Generally, simple programs tend to have a very simple abstract framework. Simple cellular automata, Turing machines, and combinators are examples of such frameworks, while more complex cellular automata do not necessarily qualify as simple programs. It is also possible to invent new frameworks, particularly to capture the operation of natural systems. The remarkable feature of simple programs is that a significant fraction of them are capable of producing great complexity: simply enumerating all possible variations of almost any class of programs quickly leads to examples that do unexpected and interesting things. This leads to the question: if the program is so simple, where does the complexity come from? In a sense, there is not enough room in the program’s definition to directly encode everything the program can do, so simple programs can be seen as a minimal example of emergence. A logical deduction from this phenomenon is that if the details of a program’s rules have little direct relationship to its behavior, then it is very difficult to directly engineer a simple program to perform a specific behavior. An alternative approach is to engineer a simple overall computational framework, and then do a brute-force search through all of the possible components for the best match.
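The requirement that a simple program fit in a few lines of code is easy to meet for the elementary cellular automata that open the book. The following Python sketch is illustrative only (it is not code from the book); it assumes Wolfram’s standard rule-number encoding, in which bit k of the rule number gives the new cell value for the 3-bit neighborhood whose binary value is k. It evolves any of the 256 elementary rules and prints Rule 30 growing from a single black cell:

```python
def step(cells, rule):
    """One step of an elementary cellular automaton with periodic boundaries.

    cells: list of 0/1 values; rule: integer 0-255 (Wolfram rule number).
    Each cell's new value is the bit of `rule` indexed by the 3-bit
    neighborhood (left, center, right) read as a binary number.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Rule 30 from a single black cell: a trivially simple rule whose
# output nevertheless looks complex.
width, steps = 31, 15
row = [0] * width
row[width // 2] = 1
history = [row]
for _ in range(steps):
    row = step(row, 30)
    history.append(row)

for r in history:
    print("".join("#" if c else "." for c in r))
```

Despite the one-line update rule, the center column of Rule 30 is irregular enough that Wolfram used it as a pseudorandom generator, which is the kind of complexity-from-simplicity the book emphasizes.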

Simple programs are capable of a remarkable range of behavior. Some have been proven to be universal computers. Others exhibit properties familiar from traditional science, such as thermodynamic behavior, continuum behavior, conserved quantities, percolation, sensitive dependence on initial conditions, and others. They have been used as models of traffic, material fracture, crystal growth, biological growth, and various sociological, geological, and ecological phenomena. Another feature of simple programs is that, according to the book, making them more complicated seems to have little effect on their overall complexity. A New Kind of Science argues that this is evidence that simple programs are enough to capture the essence of almost any complex system.

Mapping and mining the computational universe

To study simple rules and their often complex behavior, Wolfram argues that it is necessary to systematically explore all of these computational systems and document what they do. He further argues that this study should become a new branch of science, like physics or chemistry. The basic goal of this field is to understand and characterize the computational universe using experimental methods.

The proposed new branch of scientific exploration admits many different forms of scientific production. For instance, qualitative classifications are often the results of initial forays into the computational jungle. Explicit proofs that certain systems compute this or that function are also admissible. There are also forms of production that are unique to this field of study, for example, the discovery of computational mechanisms that emerge in different systems in strikingly different forms.

Another kind of production involves the creation of programs for the analysis of computational systems. In the NKS framework, these should themselves be simple programs, subject to the same goals and methodology. An extension of this idea is that the human mind is itself a computational system, and hence providing it with raw data in as effective a way as possible is crucial to research. Wolfram believes that programs and their behavior should be visualized as directly as possible, and exhaustively examined by the thousands or more. Since this new field concerns abstract rules, it can in principle address issues relevant to other fields of science. In general, Wolfram’s idea is that novel mechanisms can be discovered in the computational universe, where they can be represented in their simplest forms, and that other fields can then choose among these discoveries the ones they find relevant.

Wolfram argues that the computational universe is full of mechanisms and behaviors that science can “mine” and harness for our purposes. [5]

Systematic abstract science

While Wolfram advocates simple programs as a scientific discipline, he also argues that its methodology will revolutionize other fields of science. The basis of his argument is that the study of simple programs is the minimal possible form of science, grounded equally in abstraction and empirical experimentation. Every aspect of the methodology advocated in NKS is optimized to make experimentation as direct, easy, and meaningful as possible, while maximizing the chances that the experiment will do something unexpected. Just as this methodology allows computational mechanisms to be studied in their simplest forms, Wolfram argues that the process of doing so engages with the mathematical basis of the physical world, and therefore has much to offer the sciences.

Wolfram argues that the computational realities of the universe make science hard for fundamental reasons. But he also argues that by understanding the importance of these realities, we can learn to use them in our favor. For instance, instead of reverse-engineering our theories from observation, we can enumerate systems and then try to match them to the behaviors we observe. A major theme of NKS is investigating the structure of the possibility space. Wolfram argues that science as practiced is far too ad hoc, in part because the models used are too complicated and unnecessarily organized around the limited primitives of traditional mathematics. Wolfram advocates using models whose variations are enumerable and whose consequences are straightforward to compute and analyze.

Philosophical underpinnings

Computational irreducibility

Wolfram argues that one of his achievements is in providing a coherent system of ideas that justifies computation as an organizing principle of science. For instance, he argues that the concept of computational irreducibility is one of the reasons why computational models of the world must be considered in addition to traditional mathematical models. Likewise, his idea of intrinsic randomness generation, that natural systems can generate their own randomness rather than relying on chaos theory or stochastic perturbations, implies that computational models do not need to include explicit randomness.

Principle of computational equivalence

Wolfram developed the principle of computational equivalence (PCE): the principle states that systems found in the natural world can perform computations up to a maximal (“universal”) level of computational power, and that most systems do in fact attain this level. In principle, such systems compute the same things as a computer; computation is simply a question of translating inputs and outputs from one system to another. Consequently, most systems are computationally equivalent. Proposed examples of such systems are the workings of the human brain and the evolution of weather systems.

The principle can be restated as follows: almost all processes that are not obviously simple are of equivalent sophistication. From this principle, Wolfram draws an array of concrete deductions which he argues reinforce his theory. Possibly the most important among these is an explanation as to why we experience randomness and complexity: often, the systems we analyze are just as sophisticated as we are. Thus, complexity is not a special quality of systems, like for instance the concept of “heat,” but simply a label for all systems whose computations are sophisticated. Wolfram argues that this understanding makes the “normal science” of the NKS paradigm possible.

At the deepest level, Wolfram argues that, like many of the most important scientific ideas, the principle of computational equivalence allows science to be more general by pointing out new ways in which humans are not “special”; that is, it has been claimed that the complexity of human intelligence makes us special, but the principle asserts otherwise. In a sense, many of Wolfram’s ideas are based on understanding the scientific process, including the human mind, as operating within the same universe it studies, rather than being outside it.

Applications and results

There are a number of specific results and ideas in the NKS book, and they can be organized into several themes. One common theme of examples and applications is demonstrating how little complexity it takes to achieve interesting behavior, and how the proper methodology can discover such behavior.

First, there are several cases where the NKS book introduces what was, at the time of the book’s composition, the simplest known system in some class that has a particular characteristic. Some examples include the first primitive recursive function that results in complexity, the smallest universal Turing machine, and the shortest axiom for propositional calculus. In a similar vein, Wolfram also demonstrates many simple programs that exhibit phenomena like phase transitions, conserved quantities, continuum behavior, and thermodynamics that are familiar from traditional science. Simple computational models of natural systems such as shell growth, fluid turbulence, and phyllotaxis are a final category of applications that fall in this theme.

Another common theme is taking facts about the computational universe as a whole and using them to reason about fields in a holistic way. For instance, Wolfram discusses how facts about the computational universe inform evolutionary theory, SETI, free will, computational complexity theory, and philosophical fields like ontology, epistemology, and even postmodernism.

Wolfram suggests that the theory of computational irreducibility may provide a resolution to the existence of free will in a nominally deterministic universe. He posits that the computational process in the brain of a being with free will is actually complex enough that it cannot be captured in a simpler computation, due to the principle of computational irreducibility. Thus, while the process is indeed deterministic, there is no better way to determine the being’s will than, in essence, to run the experiment and let the being exercise it.

The book also contains a vast number of individual results, both experimental and analytic, about what a particular automaton computes or what its characteristics are, using various methods of analysis.

The book contains a new technical result describing the Turing completeness of the Rule 110 cellular automaton. Very small Turing machines can simulate Rule 110, which Wolfram demonstrates using a 2-state 5-symbol universal Turing machine. Wolfram conjectures that a particular 2-state 3-symbol Turing machine is universal. In 2007, as part of commemorating the book’s fifth anniversary, Wolfram’s company offered a $25,000 prize for proof that this Turing machine is universal. [6] Alex Smith, a computer science student from Birmingham, UK, won the prize later that year by proving Wolfram’s conjecture. [7] [8]
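The universality proof concerns the long-lived, particle-like structures that Rule 110 supports; the rule itself is just an eight-entry lookup table. The following Python sketch is an illustration only (it is in no way the proof): it spells out Rule 110’s table from the binary digits of 110 and evolves a row of cells under it.

```python
# Rule 110's local update table, read off the binary digits of 110
# (0b01101110): neighborhood (left, center, right) -> new cell value.
RULE_110 = {
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def evolve(cells, steps):
    """Evolve a row of 0/1 cells under Rule 110 with periodic boundaries,
    returning the full history including the initial row."""
    n = len(cells)
    rows = [cells]
    for _ in range(steps):
        cells = [RULE_110[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
                 for i in range(n)]
        rows.append(cells)
    return rows

# From a single black cell, Rule 110 grows a leftward-expanding region
# of interacting local structures.
for r in evolve([0] * 40 + [1], 20):
    print("".join("#" if c else "." for c in r))
```

Running the sketch for more steps (and on a suitable periodic background) makes the gliders that Cook’s construction exploits visible as diagonal stripes in the printed history.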

NKS Summer School

Every year, Wolfram and his group of instructors [9] organize a summer school. [10] The first four summer schools, from 2003 to 2006, were held at Brown University. Since then the summer school has been hosted by the University of Vermont at Burlington, with the exception of 2009, which was held at the Istituto di Scienza e Tecnologie dell’Informazione of the CNR in Pisa, Italy. After seven consecutive summer schools, more than 200 people have participated, some of whom have developed their three-week research projects into their Master’s or Ph.D. theses. [11] Some of the research done in the summer school has resulted in publications. [12] [13] [14]

Reception

Mainstream periodicals gave A New Kind of Science broad [citation needed] coverage for a science book, including articles in The New York Times, [15] Newsweek, [16] Wired, [17] and The Economist. [18] Some scientists criticized the book as abrasive and arrogant, and perceived a fatal flaw: that simple systems such as cellular automata are not complex enough to describe the degree of complexity present in evolved systems. [19] [20] Although critics accepted Wolfram’s result showing universal computation, they viewed it as minor and disputed Wolfram’s claim of a paradigm shift. Others found that the work contained valuable insights and refreshing ideas. [21] [22] Wolfram addressed his critics in a series of blog posts. [23] [24]

Scientific philosophy

A key tenet of NKS is that the simpler the system, the more likely a version of it will recur in a wide variety of more complicated contexts. Therefore, NKS argues that systematically exploring the space of simple programs will lead to a base of reusable knowledge. However, many scientists believe that of all possible parameters, only some actually occur in the universe. For instance, of all possible permutations of the symbols making up an equation, most will be essentially meaningless. NKS has also been criticized for asserting that the behavior of some systems is somehow representative of all systems.

Methodology

A common criticism of NKS is that it does not follow established scientific methodology. For instance, NKS does not establish rigorous mathematical definitions, [25] nor does it attempt to prove theorems; and most formulas and equations are written in Mathematica rather than standard notation. [26] Along these lines, NKS has also been criticized for being heavily visual, with much information conveyed by pictures that do not have formal meaning. [22] It has also been criticized for not using modern research in the field of complexity, particularly works that have studied complexity from a rigorous mathematical perspective. [20] And it has been criticized for misrepresenting chaos theory: “Throughout the book, he equates chaos theory with the phenomenon of sensitive dependence on initial conditions (SDIC).” [27]

Utility

NKS has been criticized for not providing specific results that would be immediately applicable to ongoing scientific research. [22] There has also been criticism, implicit and explicit, that the study of simple programs has little connection to the physical universe, and hence is of limited value. Steven Weinberg has pointed out that no real-world system has been explained using Wolfram’s methods in a satisfactory fashion. [28]

PCE

The principle of computational equivalence has been criticized for being vague, unmathematical, and for not making verifiable predictions. [26] It has also been criticized for being contrary to the spirit of research in mathematical logic and computational complexity theory, which seek to make fine-grained distinctions between levels of computational sophistication, and for wrongly conflating different kinds of universality property. [26] Moreover, critics such as Ray Kurzweil have argued that it ignores the distinction between hardware and software: while two computers may be equivalent in power, it does not follow that any two programs they might run are also equivalent. [19] Others suggest it is little more than a rechristening of the Church-Turing thesis. [27]

The fundamental theory (NKS Chapter 9)

Wolfram’s speculations about a direction towards a fundamental theory of physics have been criticized as vague and obsolete. Scott Aaronson, professor of computer science at the University of Texas at Austin, also claims that Wolfram’s methods cannot be compatible with both special relativity and Bell’s theorem violations, and hence cannot explain the observed results of Bell test experiments. [29] However, it has been argued that Aaronson’s objections would apply to the entire scientific field of quantum gravity and imply that it is fundamentally flawed, unless one accepts, e.g., a non-local hidden-variable theory or the superdeterminism acknowledged by Bell himself, [30] approaches explored by, for example, physics Nobel laureate Gerard ’t Hooft; [31] see also replies to criticism of digital physics. [32] [33] [disputed – discuss]

Edward Fredkin and Konrad Zuse pioneered the idea of a computable universe: Zuse described the formation of the world by a computing machine, an idea later developed by Fredkin using a toy model called Salt. [34] It has been claimed that NKS tries to take these ideas as its own, although Wolfram’s model of the universe is a rewriting network rather than a cellular automaton; Wolfram himself has suggested that a cellular automaton cannot account for relativistic features such as the absence of an absolute time frame. [35] Jürgen Schmidhuber has also charged that his work on Turing machine-computable physics was stolen without attribution, namely his idea of enumerating possible Turing-computable universes. [36]

In a 2002 review of NKS, the Nobel laureate and elementary particle physicist Steven Weinberg wrote: “Wolfram himself is a lapsed elementary particle physicist, and I suppose he can’t resist trying to apply his experience with digital computer programs to the laws of nature. This has led him to the view (also considered in a 1981 paper by Richard Feynman) that nature is discrete rather than continuous. He suggests that space consists of a set of isolated points, like cells in a cellular automaton, and that even time flows in discrete steps. It’s possible, but I can’t see any motivation for these speculations, except that this is the sort of system that Wolfram and others have become used to in their work on computers. So might a carpenter, looking at the moon, suppose that it is made of wood.” [37]

Nobel laureate Gerard ’t Hooft has also suggested a cellular automaton-based unifying theory of quantum gravity as an interpretation of superstring theory in which the evolution equations are classical: “[b]oth the bosonic string theory and superstring theory can be reformulated in terms of a special basis of states, defined on a space-time lattice with lattice length 2π”. [31]

Natural selection

Wolfram’s claim that natural selection is not the fundamental cause of complexity in biology has led journalist Chris Lavers to state that Wolfram does not understand the theory of evolution. [38]

Originality

NKS has been heavily criticized as not being original or important enough to justify its title and claims.

The authoritative manner in which NKS presents a vast number of examples and arguments has been criticized as leading the reader to believe that each of these ideas was original to Wolfram; [27] in particular, one of the most substantial new technical results presented in the book, that the Rule 110 cellular automaton is Turing complete, was not proven by Wolfram, but by his research assistant, Matthew Cook. However, the notes section of the book does acknowledge many discoveries made by other scientists, citing their names together with historical facts.

Additionally, it has been pointed out that the idea that very simple rules often generate great complexity was already well established, particularly in chaos theory and complex systems. [20] Some have argued [who?] that the use of computer simulation is already ubiquitous, and that instead of starting a paradigm shift, NKS merely adds justification to a paradigm shift that was undertaken long ago. Wolfram’s NKS might then seem to be just one of the books describing this shift.