BITS

The basis of computing, a BIT is a unit of information with a binary attribute, i.e. two possible states: on/off, +/-, true/false, yes/no, or, most commonly, in binary notation, 1 or 0. It is in this sense, as the ones and zeros represented by tiny charges in silicon chips and the building blocks of modern information technology, that the BIT is ubiquitous.
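To make the idea concrete, here is a minimal Python sketch (illustrative only) showing how bits combine: each extra bit doubles the number of possible states, and any decimal number can be written in binary notation as a string of 1s and 0s.

```python
# Each bit has two states, so n bits give 2**n possible combinations.
for n in range(1, 5):
    print(n, "bits give", 2**n, "states")

# The decimal number 6 written in binary notation:
print(format(6, "b"))  # prints 110
```

Eight bits give 256 states, which is why a byte can hold any value from 0 to 255.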

The modern binary system was codified by the philosopher Gottfried Wilhelm Leibniz in the late 17th century. Leibniz was a polymath whose wide-ranging interests included law, history and linguistics, and who invented calculus independently of Newton (his notation is still the standard in maths). But it is his work on logic and the binary system that has been of fundamental importance to the modern world of computing, and thus to digital imaging.

Interestingly, Leibniz was fascinated by the Orient and had studied the Chinese I Ching and its hexagrams, which map to the binary numbers 0 to 111111 (0 to 63 in decimal). Perhaps Yin and Yang can be considered the oldest conceptually binary attributes.
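The mapping is easy to see in code. In this sketch (the helper name is my own, purely for illustration), each of a hexagram's six lines is treated as one bit, yin (broken) as 0 and yang (solid) as 1, and the six lines are read together as a 6-bit binary number.

```python
def hexagram_value(lines):
    """Read six 0/1 line values, top bit first, as one binary number.

    An illustrative helper: yin (broken line) = 0, yang (solid line) = 1.
    """
    value = 0
    for bit in lines:
        value = value * 2 + bit
    return value

print(hexagram_value([0, 0, 0, 0, 0, 0]))  # all yin: 0
print(hexagram_value([1, 1, 1, 1, 1, 1]))  # all yang: 63, i.e. binary 111111
```

Six binary lines give 2 to the power 6, or 64, hexagrams, exactly the number in the I Ching.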

Pictured: using Leibniz’s mechanisms and logic to perform mathematical calculations.

Leibniz used his system to invent a mechanical calculating machine he catchily called the “stepped reckoner”. Way ahead of its time, it challenged the engineers of the day with its mechanical complexity, but two were eventually built. The mechanism Leibniz conceived for the reckoner was the blueprint for mechanical calculating machines right up to the Curta Calculator (see pic), which was developed in 1948 and remained in use in some spheres into the 1980s, before electronic calculating devices became reliable in adverse conditions. Rally teams were late users, and I’m sure they would have been seen on some film sets!

Leibniz was also one of the great 17th-century rationalists, along with Descartes and Spinoza, and in philosophy is remembered mainly for his optimism; he was the model for Voltaire’s Dr Pangloss, who believed “everything is for the best in this, the best of all possible worlds”. What a guy.

The concept of the BIT and the binary system was taken forward by the Englishman George Boole in the mid-19th century. Using the binary system to develop his ideas about logic, he devised BOOLEAN ALGEBRA, the basis of modern digital computing.
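Boole's insight was that logic could be done as arithmetic on just the two values 0 and 1, with AND behaving like multiplication and OR like addition. A minimal Python sketch of the three basic operations:

```python
# Boolean algebra on two values: True (1) and False (0).
a, b = True, False

print(a and b)  # AND: true only if both are true, prints False
print(a or b)   # OR: true if either is true, prints True
print(not a)    # NOT: flips the value, prints False

# De Morgan's law, one of the identities of Boolean algebra:
# NOT (a AND b) is the same as (NOT a) OR (NOT b).
print((not (a and b)) == ((not a) or (not b)))  # prints True
```

Every logic gate in a silicon chip, and every conditional in a piece of software, is ultimately built from these three operations.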