Mankind’s Journey to AI

The first polymath to imagine something close to what we today call Artificial Intelligence was Raymond Lull—a Spanish theologian who lived in the 13th century on the island of Majorca. Legend has it that Lull had a vision in 1272 during a solitary spiritual retreat on top of a mountain. He envisioned a set of basic principles that, properly combined, could lead to the core principles of every science and every form of knowledge. In his works, Lull used symbols to represent each basic principle so that each basic truth could be expressed in a formal way. Although way ahead of his time, Lull went no further than using the system as an aid to exposition and memory.

In the middle of the 17th century, Gottfried Leibniz built on the foundation of Lull’s work and conjectured that human thought could be systematized into an array of algebraic rules, so that any argumentation could be reduced to some basic mechanical calculation. Leibniz’s work inspired the further development of mathematical logic carried out by George Boole and, in the early 20th century, by David Hilbert and Bertrand Russell.

In particular, Hilbert asked whether there could exist a single set of axioms from which all mathematical statements could be derived. No conclusive answer came for a couple of decades. Then, all of a sudden, in 1931 a 25-year-old researcher made a giant step forward.

In spite of being nowhere near as famous as, say, Aristotle, Kurt Gödel (1906-1978) is arguably the most impactful logician in the history of mankind. In his paper “On Formally Undecidable Propositions of Principia Mathematica and Related Systems”, Gödel demonstrated two propositions of mathematical logic, later collectively immortalized as the Incompleteness Theorems.

With his work, on the one hand Gödel proved that there are things that mathematical logic just can’t prove. On the other hand, though, he proved that, within the limits of a consistent formal system, any reasoning can always be expressed as a set of formal rules and then, in some way, mechanized.

This aspect is monumentally relevant, as it lays the theoretical foundation for mechanical reasoning—the ancestor of modern computer-based reasoning.

All this was happening in the early 1930s, and the horror of the Second World War was still to materialize. In those years there was no such thing as a computer; indeed, anything like a modern computer was completely beyond imagination. Yet Gödel set the ground for mechanical computing, which later developed into electronic calculators and, ultimately, into forms of artificial intelligence. The results achieved by Gödel sparked three parallel and independent research paths that converged on the same result within a few years, around the mid-1930s.

Gödel himself, in 1933, formulated the concept of general recursive functions—computable functions that take a finite tuple of natural numbers and return a natural number. In 1936, Alonzo Church, a logician at Princeton University, defined lambda calculus, a formalism able to express any computation based on natural numbers. At nearly the same time, and in a fully independent way, British mathematician Alan Turing built the theoretical model of a computing machine, the now-famous Turing machine, to perform symbolic calculations on an infinite tape.

The three classes of computable functions were later unified by the Church-Turing thesis: a function is computable in the lambda calculus if and only if it is computable by a Turing machine, if and only if it can be defined as a general recursive function.
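To make the equivalence concrete, here is a small illustrative sketch (not from the original text, and written in Python purely for readability): the same function, addition on natural numbers, expressed in two of the three equivalent formalisms—first as a recursive function in Gödel’s style, then in lambda calculus using Church numerals.

```python
# 1) Addition as a recursive function, in the spirit of general recursion:
#    add(m, 0) = m;  add(m, n + 1) = add(m, n) + 1
def add_recursive(m: int, n: int) -> int:
    if n == 0:
        return m
    return add_recursive(m, n - 1) + 1

# 2) Addition in lambda calculus with Church numerals, where the numeral n
#    is the function that applies its argument f exactly n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add_church = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(church) -> int:
    """Decode a Church numeral back into an ordinary integer."""
    return church(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)

print(add_recursive(2, 3))             # 5
print(to_int(add_church(two)(three)))  # 5
```

Both computations produce the same result by entirely different mechanisms, which is exactly the kind of equivalence the Church-Turing thesis asserts.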

What’s the point, anyway?

The Church-Turing thesis makes it conceivable to build a mechanical device able to reproduce the process of mathematical deduction through the manipulation of symbols. Seen through today’s eyes it seems a foregone conclusion, but it was the late 1930s, and these people—Gödel, Church and Turing—were just three men who dreamt of things that never were. Better yet, they didn’t just dream of those things; they also made a substantial contribution to making them happen!
