Models of Computation

When Alan Turing set out to capture what it means for a task to be computable, he modeled computation in the form of a state machine. And thus was born the Turing machine. It is not a physical machine per se, but an idealized model of computation. That model later crystallized into a formal statement characterizing the nature of computation: the Church–Turing thesis.
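To make the state-machine idea concrete, here is a minimal sketch of a Turing machine in Python. The transition-table shape `(state, symbol) -> (new_state, written_symbol, head_move)` and the `"halt"` convention are assumptions of this sketch, not part of any standard API; the illustrative machine simply flips every bit on its tape.

```python
# A minimal sketch of a Turing machine (illustrative, not canonical).
# transitions: (state, symbol) -> (new_state, written_symbol, "L" or "R")

def run_turing_machine(transitions, tape, state="start", blank="_"):
    """Run the machine until it enters the 'halt' state; return the tape."""
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        state, written, move = transitions[(state, symbol)]
        if head < len(tape):
            tape[head] = written
        else:
            tape.append(written)
        head += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

# Example machine: flip 0s and 1s, halting at the blank past the input.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
```

Running `run_turing_machine(flip, "0110")` yields `"1001"`: the machine's behavior is determined entirely by its current state and the symbol under the head, which is the essence of Turing's model.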

Alonzo Church, along with mathematician Stephen Kleene and logician J.B. Rosser, created a formal definition of a class of functions whose values could be calculated by recursion.
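The recursion-theoretic style can be sketched by defining addition from nothing but the successor function and a recursive rule, in the spirit of the primitive recursive functions. The Python function names here are illustrative, not drawn from any source.

```python
# A sketch of defining a function purely by recursion from the successor.
# add(m, 0) = m ;  add(m, n + 1) = succ(add(m, n))

def succ(n):
    """The successor function: the only arithmetic primitive used."""
    return n + 1

def add(m, n):
    """Addition defined by recursion on the second argument."""
    if n == 0:
        return m
    return succ(add(m, n - 1))
```

Note that `add` never uses `+` on its arguments directly; everything reduces to repeated applications of `succ`, which is the sense in which the function's values are "calculated by recursion".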

All three computational processes (recursion, the λ-calculus, and the Turing machine) were shown to be equivalent: all three approaches define the same class of functions. This has led mathematicians and computer scientists to believe that the concept of computability is accurately characterized by these three equivalent processes. Informally, the Church–Turing thesis states that if some method (algorithm) exists to carry out a calculation, then the same calculation can also be carried out by a Turing machine (as well as by a recursively definable function, and by a λ-function).

The Turing machine led to the development of imperative programming languages (courtesy: Anthony Aaby).

Imperative programming is characterized by programming with a state and commands that modify the state.
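That characterization can be seen in miniature below: a small piece of mutable state and a loop of commands that update it step by step. The task (summing 1 through n) is chosen purely for illustration.

```python
# Imperative style in miniature: mutable state plus commands that change it.

def sum_to(n):
    total = 0          # the state
    i = 1
    while i <= n:      # commands executed in sequence, modifying the state
        total += i
        i += 1
    return total
```

The meaning of the program lies in how the state evolves over time, which is exactly the Turing-machine view of computation carried into programming languages.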

Functional programming languages, on the other hand, are based on the lambda calculus:

The lambda-calculus grew out of an attempt by Alonzo Church and Stephen Kleene in the early 1930s to formalize the notion of computability (also known as constructibility and effective calculability). It is a formalization of the notion of functions as rules (as opposed to functions as tuples). As with mathematical expressions, it is characterized by the principle that the value of an expression depends only on the values of its subexpressions.
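The "functions as rules" view can be illustrated with Church numerals, the standard lambda-calculus encoding of the natural numbers as pure functions. This is a sketch in Python's `lambda` syntax; the `to_int` decoder is an illustrative helper, not part of the calculus itself.

```python
# Church numerals: numbers encoded as rules for iterating a function.
# A numeral n takes a function f and returns "apply f n times".

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by iterating 'plus one' starting from 0."""
    return n(lambda k: k + 1)(0)
```

Every expression here is a function built only from its subexpressions, with no state anywhere, which is precisely the principle the paragraph above describes: the value of an expression depends only on the values of its subexpressions.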