The British mathematician Alan Turing (1912-1954) developed the theory of computation by designing an abstract computing machine which could perform any logical operation yet had only two instructions, Add and Shift (see Appendix 2). He then proved mathematically that this simple machine could perform any given computer operation, given enough time. This is the basic structure of modern computer architecture: the so-called "von Neumann" machine reads one instruction at a time, processes it and then reads the next instruction. A von Neumann machine has a single memory in which both data and instructions are stored. The CPU fetches information from the memory and either executes it (instructions) or processes it (data). When the CPU has completed an instruction it steps forward one location in memory and fetches the next instruction. Instructions and data are mixed together in memory, and it is only the skill and discipline of the programmer which keeps them separate. This is very similar in concept to the Turing machine.
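The fetch-execute cycle described above can be sketched as a toy interpreter. The three opcodes here (LOAD, ADD, HALT) are invented purely for illustration and do not correspond to any real instruction set:

```python
# Toy von Neumann machine: instructions and data share one memory.
# Hypothetical opcodes: 1 = LOAD addr, 2 = ADD addr, 3 = HALT.
def run(memory):
    acc = 0          # accumulator
    pc = 0           # program counter steps through memory
    while True:
        op = memory[pc]                      # fetch
        if op == 1:                          # LOAD: copy a memory cell into acc
            acc = memory[memory[pc + 1]]
            pc += 2
        elif op == 2:                        # ADD: add a memory cell to acc
            acc += memory[memory[pc + 1]]
            pc += 2
        elif op == 3:                        # HALT
            return acc
        # Nothing stops pc from wandering into the data region --
        # only programmer discipline separates code from data.

# Program in cells 0-4, data in cells 5-6: load 10, add 32, halt.
memory = [1, 5, 2, 6, 3, 10, 32]
print(run(memory))  # 42
```

Note that the "program" and the "data" are just numbers in the same list; the machine cannot tell them apart.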

Another architecture is known as the Harvard architecture. In this system instructions and data are completely separated, and the data and instruction word widths can differ. For example, the PIC microcontrollers from Microchip have an 8-bit data path and 12-bit instructions.
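A minimal sketch of the idea, with separate instruction and data stores of different widths. The 12-bit/8-bit split mirrors the PIC example above, but the instruction format and opcodes are invented for illustration:

```python
# Toy Harvard machine: program memory and data memory are separate,
# with different word widths (12-bit instructions, 8-bit data).
# Hypothetical instruction format: 4-bit opcode | 8-bit operand.
LOAD, ADD, HALT = 0x1, 0x2, 0x3

def run(program, data):
    acc, pc = 0, 0
    while True:
        word = program[pc] & 0xFFF            # 12-bit instruction fetch
        op, operand = word >> 8, word & 0xFF
        pc += 1
        if op == LOAD:
            acc = data[operand] & 0xFF        # 8-bit data fetch
        elif op == ADD:
            acc = (acc + data[operand]) & 0xFF
        elif op == HALT:
            return acc
        # pc can never point into data[]: the two memories are disjoint.

program = [0x105, 0x206, 0x300]   # LOAD 5; ADD 6; HALT
data = [0] * 5 + [10, 32]
print(run(program, data))  # 42
```

Unlike the von Neumann case, a wild program counter here can only ever fetch instructions, never data.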

For convenience, modern CPUs have many instructions and wider word widths. The more sophisticated instructions, such as divide or move-string-of-data, take more than one clock cycle to complete; some single instructions may take hundreds of clock cycles. Such machines operate slower (per instruction) but are more convenient to program.

About the time that µP manufacturers were starting to put divide instructions onto CPU chips, several companies proposed a return to simplicity and developed the Reduced Instruction Set Computer (RISC). This machine has a small instruction set and operates extremely quickly. Of course, if you want to do more complex things you need more instructions, and some of the speed advantage is lost. Overall, though, programs will run faster, as each program carries only what it needs and no extra baggage.

The RISC philosophy is somewhat reflected in current software trends. A few years ago Pascal was the language of choice. Pascal can do many things, such as adding arrays of numbers and other sophisticated functions, but it also has many reserved words, operators and syntax rules that must be obeyed. Pascal source code tends to be verbose, and the run-time code tends to be large and slow unless you have an optimizing compiler (most current Pascal compilers do some optimizing). Currently C is the most popular language. C has a very small kernel with few built-in functions; for example, it doesn't even have a "print" function built in. Printing is provided by a standard library of add-on functions. C source code is often very terse (almost unreadable), the compiled code is fast, and there are very few rules restricting the programmer.

Note that RISC machines are not necessarily related to parallel computing. Parallel computing is often done on RISC machines but it doesn't have to be.

Turing machines are named after the British mathematician Alan M. Turing, who first proposed them as a way to define computation. In one sense, a Turing machine is the ultimate digital computing machine. It can compute anything that a modern computer can - as long as it is given enough time.

One can visualize a Turing machine as a black box equipped with a device that reads the symbol in a single cell of an infinitely long tape, writes a new symbol in the cell and moves the tape either forward or backward in order to examine the symbol in an adjacent cell. What is inside the black box? It does not really matter, as long as the box adheres strictly to a given table that lists an entry for every combination of symbol read and machine "state". The state may change with each cycle of operation. A cycle consists of the following three steps:

1. Read the symbol currently under the read/write device.

2. Look up the table entry given by the machine's current state and the symbol just read.

3. Write the symbol given by the table entry, move the tape in the direction indicated and enter the next state shown.

Each table entry therefore has three parts: a symbol to be written on the current cell, a direction in which to move the tape and a next state to enter.

To a Turing machine the tape's motion is relative. One could just as easily arrange for the tape to remain fixed and the machine to move itself from cell to cell.
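The three-step cycle above can be sketched as a short simulator. Each table entry is (symbol to write, direction to move, next state), exactly as described; the example table itself (states, symbols and entries) is invented for illustration and adds one to a binary number, with the head starting at the rightmost digit:

```python
# A tiny Turing machine simulator.  The tape is a sparse dict, so it is
# "infinite" in both directions; unwritten cells read as blank ("_").
def run(table, tape, head, state):
    cells = dict(enumerate(tape))
    while state != "halt":
        symbol = cells.get(head, "_")                # 1. read the current cell
        write, move, state = table[(state, symbol)]  # 2. look up the table entry
        cells[head] = write                          # 3. write, move, next state
        head += 1 if move == "R" else -1
    low, high = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(low, high + 1))

table = {
    ("inc", "1"): ("0", "L", "inc"),   # carry: 1 -> 0, keep moving left
    ("inc", "0"): ("1", "L", "halt"),  # absorb the carry and stop
    ("inc", "_"): ("1", "L", "halt"),  # ran off the left end: new digit
}
print(run(table, "1011", head=3, state="inc"))  # 1100
```

Three table entries are enough here; a machine doing anything interesting needs a much larger table, but the cycle never changes.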

Artificial Intelligence - The Turing Test

Another famous invention of Alan Turing's is the Turing test, which was proposed as a way of evaluating an artificial-intelligence machine. The user (tester) sits at a terminal and engages in a dialogue with the machine. Anything can be entered at the terminal: questions, comments, math problems, written abuse, etc. The user must decide from the responses whether he is communicating with a computer or with a human being sitting at another terminal. When a computer can convince enough people that it is actually human, the computer is said to have passed the Turing test.