John W. Backus (1924 - 2007)

The man who for all practical purposes invented the modern concept of the compiled, high-level programming language, and who gave us the first principles upon which the software industry was formed, died over the weekend at age 82. John W. Backus led the Programming Research Group within IBM's Applied Science Division, which in 1954 set out to develop a more sensible language that the IBM 704 could translate into its own machine language.

FORTRAN was not the first programming language; indeed, there were symbolic constructs prior to FORTRAN whose purpose was to represent machine instructions with more tangible, teachable concepts. Backus himself had developed one such precursor: the so-called "Speedcoding" system for the IBM 701, a project launched and completed in 1953.

Backus was an intellectual - a gifted thinker who, though trained in mathematics at Columbia, was never a formalist at heart. In another age, without access to a computer - even one that filled the basement of a large building - he might have been a philosopher. His library at the time included the works of the linguist and theorist Noam Chomsky, who studied human language as a formal system, developing a symbolic syntax with which to frame his concepts of languages within languages - of grammars that describe grammars. Backus borrowed some of Chomsky's concepts, including the idea that a symbology could represent a computer language...even one that didn't yet exist.

It was a concept three or four steps ahead of its time, far out of sequence, and it triggered a fast-forward in the evolution of computer science that has never since been duplicated: the concept of a pattern representing an algebraic procedure that could be reconstituted not only into one computer's machine language, but into another computer's machine language by an entirely different process - a meta-language, as both Chomsky and Backus called it.
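The notation Backus devised to describe syntax - later refined with Peter Naur and known today as Backus-Naur Form, or BNF - expresses a language as a set of rewriting rules. As an illustration only (this grammar is a simplified invention for this article, not taken from any ALGOL report), the syntax of elementary arithmetic expressions might be written:

```
<expression> ::= <term> | <expression> + <term>
<term>       ::= <factor> | <term> * <factor>
<factor>     ::= <number> | ( <expression> )
<number>     ::= <digit> | <number> <digit>
<digit>      ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
```

Each rule states how the symbol on the left may be rewritten into the symbols on the right: a language described in a language, which is exactly what a meta-language is.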

This meta-language found its first great application not in FORTRAN but in ALGOL, the algorithmic grammar that was the direct forerunner of the procedural languages to follow for the next three decades, including BASIC, Pascal, and C. If one needed to conceive the basis for a working program, one could write the fundamentals in ALGOL first. Later, someone would coin the term pseudo-code to describe these frameworks.

Because ALGOL was not yet a real, running programming language in 1957 (although compilers and interpreters for later languages bearing the ALGOL name were eventually built), it was difficult for associations of programmers to pitch it to potential backers as something worth their monetary investment. But it was the only real candidate for a universal standard - something that the dozens upon dozens of computer designs the world was building could embrace as common ground.

For ALGOL to develop, just like software projects today, it needed a source of revenue. Universities owned most of the computers, and the idea of an international standard for programming was not all that interesting to deans and presidents whose heads were wound around the idea of competition, not cooperation. So the Association for Computing Machinery devised a brilliant plan: It would present ALGOL to what was then the only global body through which international standards could be determined - literally, the United Nations - but it had to give the proposal a certain spin, because the UN didn't fund math.

The spin the ACM came up with was born not so much of Backus' methodology as of his philosophy: that ALGOL was a communications system. Digital communication was a glorious new frontier for which there were no principal stakeholders yet. And ALGOL borrowed words and grammar from English - which was, after all, the international language of airplane pilots and amateur radio operators.

Thus when the UN-sponsored conference ratified ALGOL as an international standard, it inadvertently - and perhaps without a total understanding of what it was doing - ratified a burgeoning new concept: that software running in one process could communicate with software running in another process, using the formulaic concepts first proposed in ALGOL and first tried, in the form of the subroutine, in FORTRAN.

In a 1979 interview with IBM's in-house magazine Think, Backus - who had received the Turing Award two years earlier - explained that he had not yet invented the thing for which he was credited: the fully automated programming language. FORTRAN represented a major step along the way, he said, especially by having introduced the concept of loops - repeated code, which led to modular code and then to intercommunicative code. But his next step would take advantage of a then-revolutionary concept that today we take for granted without a second thought: a surplus of system memory, which now measures in the gigabytes.

Here is an extended excerpt of Backus' comments from that 1979 interview:

Because it takes pages and pages of gobbledygook to describe how a programming language works, it's hard to prove that a given program actually does what it's supposed to. Therefore, programmers must learn not only this enormously complicated language but, to prove their programs will work, they must also learn a highly technical logical system in which to reason about them.

Now, in the kinds of systems I'm trying to build, you can write a program as essentially an equation, like equations in high school algebra, and the solution of that equation will be the program you want. What's more, you can prove your programs in the language of the programs themselves. The entire language can be described in one page. But there's a catch: They're what I call applicative languages, which means there's no concept of a stored memory at all.

...What I want to do is to come up with a computing system that doesn't depend on a memory at all, and combine that system, in a rather loose fashion, with one that has a memory but keeps the simplicity and the algebraic properties of the memory-less system. Then, hopefully, the process of algebraically speeding up programs can be mechanized so that people can write the simplest programs and not have to care whether they are efficient or not. The computer will do the hard work. And more than that, perhaps a lot of programs can be written simply by describing the program you want with an equation.
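The "applicative" system Backus describes here became his FP language, introduced in his 1977 Turing Award lecture, "Can Programming Be Liberated from the von Neumann Style?" As a rough sketch - rendered in Python rather than FP's own notation, with the combining forms spelled out as ordinary functions - his signature example defines the inner product of two vectors purely by composing functions, with no assignment or stored state:

```python
from functools import reduce
from operator import add, mul

# A sketch of Backus's "function-level" style in ordinary Python:
# the program is built by combining functions, never by writing to memory.

def compose(f, g):
    """Functional composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: f(g(x))

# Backus's inner-product example from the 1977 Turing lecture:
#   IP = (insert +) o (apply-to-all x) o transpose
transpose = lambda vectors: list(zip(*vectors))                 # pair up elements
apply_to_all_mul = lambda pairs: [mul(a, b) for a, b in pairs]  # multiply each pair
insert_add = lambda xs: reduce(add, xs)                         # fold the list with +

inner_product = compose(insert_add, compose(apply_to_all_mul, transpose))

print(inner_product(([1, 2, 3], [4, 5, 6])))  # 1*4 + 2*5 + 3*6 = 32
```

The definition reads right to left, like the equation it is: pair up the two vectors, multiply each pair, sum the results. Proving a property of inner_product means reasoning about three small functions and their composition - precisely the "algebra of programs" Backus had in mind.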

What Backus would later come to realize is that the era of ubiquitous memory would trigger the onslaught of the monolithic application - programs whose memory requirements grew ever larger even as their functionality often did not grow in tandem. The modern age shares one thing in common with Backus' intentions: Mathematical efficiency is no longer the principal goal of computing, and hasn't been for quite some time.

Nonetheless, for three decades after John Backus gave the world the formula translator and the framework to comprehend it, the dream of a computer-literate society involved the comprehension of languages, of communication - representing ideas as symbols. Where we find ourselves today isn't exactly that place, though it's difficult to say for certain that we diverged onto the wrong track. But in searching for a way forward today, we would do well to revisit some of the fundamental principles of procedural logic that Backus made real for us. Nothing about the framework he produced in the early 1950s, and to which he devoted the remainder of his career, is outmoded or antiquated. Indeed, that was his intent: to find something timeless in the practice of computing. He was successful in this goal at a very young age, and for the latter half of his life, he took pride in that accomplishment. Our recognition of this should make us both proud and perhaps a little envious, even now.