Ada to Ziv: Names in Computers

June 23, 2012 would have been the 100th birthday of the British polymath Alan Turing. Among his achievements, Turing contributed substantially to the field of computing, and his name shows up multiple times in the lexicon of IT. Reflecting on this made me wonder who else I might find represented in the vocabulary of the field. Lots of people, it turns out.

Let's start with Turing. In 1936 (before computers), Turing formulated a hypothetical "automatic machine," now known as a Turing machine. Although a Turing machine isn't a practical design for an actual computer, it's a useful way to think about what sorts of things a machine can compute. If you happened to go to the Google home page on Turing's birthday, you might have seen that day's Google Doodle, which was a brilliant interactive model of a Turing machine.
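To make the idea concrete, here's a minimal Turing machine simulator, sketched in Python. Everything here (the function name, the rule format, and the example bit-flipping machine) is invented for illustration, not taken from Turing's paper or the Doodle:

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run a one-tape Turing machine.

    rules maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (step left) or +1 (step right).
    The machine stops when it reaches the state "halt".
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        state, write, move = rules[(state, symbol)]
        cells[pos] = write
        pos += move
    # Read back the tape contents, trimming blanks at the ends.
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: flip every bit, then halt at the first blank cell.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine(flip_rules, "0110"))  # -> 1001
```

Despite how primitive this machinery looks, a table of rules like this is, in principle, all you need to compute anything your laptop can.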

Along these lines, a computer or computer language is said to be Turing complete if, given enough time and memory, it can in principle carry out any algorithm. These days, not only is your desktop computer Turing complete, but so is your phone. Turing also left us the Turing test, which tries to distinguish humans from computers, as I mentioned in the article about CAPTCHAs.

Many computer languages are named for people. The language Ada honors Ada Lovelace, a 19th-century mathematician (and the daughter of Lord Byron) who worked with Charles Babbage on an early mechanical computer. The computer scientist Niklaus Wirth created two languages named for people: Pascal for Blaise Pascal, and Euler for the mathematician Leonhard Euler. The language Haskell is named for the American mathematician Haskell Curry, Erlang is named for the Danish mathematician Agner Krarup Erlang, and Gödel is named for the logician Kurt Gödel. There's even a language named for the artist M. C. Escher. (If you're familiar with Escher's work, you might be able to imagine why it would inspire computer scientists.) In a less formal process, the name of the operating system Linux arose as a combination of the originator's first name (Linus Torvalds) and Unix, the operating system whose design it follows.

The computer industry is rich in "laws" and principles, some of which you might know, that are named after people. In 1965, Intel's Gordon Moore observed that the capacity of computer chips seemed to double every year. Moore's Law, as it was eventually called, was prescient: although Moore looked forward only to 1970, Moore's Law has described computer chip technology for nearly 50 years, enabling the kind of miniaturization that lets you carry a computer (i.e., a smartphone) in your pocket. Many people have probably heard, and possibly suffered, some variation of Brooks's Law ("Adding manpower to a late software project makes it later"), which comes from Fred Brooks's legendary 1975 book The Mythical Man-Month. Linus's Law, named in honor of Linus Torvalds, says that "given enough eyeballs, all bugs are shallow," a kind of programmer version of the proverb "many hands make light work." An important design principle in software is Postel's Law (also known as the robustness principle), named for Jon Postel, which states that implementations should be "conservative in what they send, liberal in what they accept," a maxim that sounds nearly Biblical in its wisdom. This principle has been critical to the development of the Internet; it's part of what makes it practical for us to request a web page that might live on an entirely different computer halfway around the world.

Certain programming techniques have their eponyms as well. George Boole was an English logician who invented Boolean algebra. The logic in digital computers ultimately just consists of head-spinningly complex chains of Boolean tests (on/off, true/false) and of Boolean operators (AND, OR, and NOT). A more obscure programming technique is currying, which I mention only because it's a second thing in computers, in addition to the programming language, that's named for Haskell Curry. There are names pretty well hidden in the LZ compression algorithm, named for Abraham Lempel and Jacob Ziv, which you and I take advantage of every time we use .GIF or .PNG files.
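For the curious, currying turns a function of several arguments into a chain of one-argument functions. A small sketch in Python (the function names here are my own, for illustration):

```python
def add(x, y, z):
    """An ordinary three-argument function."""
    return x + y + z

def curry_three(f):
    """Return the curried form of a three-argument function."""
    return lambda x: lambda y: lambda z: f(x, y, z)

curried_add = curry_three(add)
print(curried_add(1)(2)(3))   # -> 6

# Supplying only some arguments yields a new, specialized function:
add_ten = curried_add(10)
print(add_ten(4)(5))          # -> 19
```

In languages like Haskell, every function is curried automatically; in Python you'd more often reach for functools.partial to get a similar effect.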

Programmers also frequently use a couple of near-eponyms, so to speak. If you write an addition problem as "+ 4 1" (add 4 and 1), you're using something called Polish notation, which is useful in certain computer contexts because it needs no parentheses or precedence rules. The "Polish" in Polish notation refers indirectly to its inventor, the logician Jan Łukasiewicz. Reasonably enough, if instead you write an expression as "4 1 +", you're using Reverse Polish notation (RPN), which turns out to be especially easy for a machine to evaluate. In fact, some readers might be old enough to remember that RPN was actually how you used some old calculators and adding machines.
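Part of RPN's appeal is how little machinery it takes to evaluate: you push numbers onto a stack, and each operator consumes the two values on top. A tiny evaluator as a sketch (the function name and operator set are minimal inventions for illustration):

```python
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_rpn(expression):
    """Evaluate a space-separated RPN expression, e.g. '4 1 +'."""
    stack = []
    for token in expression.split():
        if token in OPS:
            right = stack.pop()   # operands come off in reverse order
            left = stack.pop()
            stack.append(OPS[token](left, right))
        else:
            stack.append(float(token))
    return stack.pop()

print(eval_rpn("4 1 +"))        # -> 5.0
print(eval_rpn("3 4 + 2 *"))    # -> 14.0, i.e. (3 + 4) * 2
```

Note that no parentheses are ever needed; the order of the tokens alone determines the order of the operations, which is why stack-based calculators liked it.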

Another near-eponym in programming is Hungarian notation, where things are named using a prefix to suggest their function — for example, iCounter might be a counter that's an integer, and txtName might be a text box for entering a name. One theory is that Hungarian notation is named for its inventor, the Hungarian programmer Charles Simonyi. It might also be named for the fact that personal names in Hungarian are reversed (last name, then first name). Or a probably apocryphal story says it was named Hungarian because the prefixes looked strange — "like Hungarian." Maybe all three stories explain the name.
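In practice, code in the Hungarian style looks something like the following sketch. The variable names and prefixes here are invented for illustration; real codebases each had their own prefix vocabulary:

```python
# Hungarian-style names: the prefix hints at the variable's type or role.
iCounter = 0          # "i"  prefix: an integer
sUserName = "ada"     # "s"  prefix: a string
rgScores = [90, 85]   # "rg" prefix: a "range" (an array or list)

for _ in rgScores:
    iCounter += 1

print(iCounter)  # -> 2
```

The convention has fallen out of favor in languages whose compilers and editors track types for you, but you'll still meet it in older Windows and Office code.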

Most of these name-related terms aren't used much in everyday English, of course, although several (Turing test, Moore's Law, Linux) appear often in general-interest articles or books about computers. But it's pleasing that the everyday life of programmers does remember people who have contributed so much to our understanding of the field.

Comments from our users:

I've used RPN and had forgotten about that entirely, although I didn't know at the time it was called RPN. That was when we were allowed to have calculators in the classroom, but we were only allowed to use them to check our work, not to do our work.

A few more eponyms used in the computer industry, although not necessarily specific to it, are Andrew File System, Bezier curve, Bourne shell, Boyce-Codd Normal Form, Gantt chart, Gaussian Blur, Perlin Noise and possibly also Kerberos, named after the mythological figure, not to mention font names like Garamond, Bodoni etc.

You summed up all the relevant computer personalities here in almost chronological order. That's very distinctive. Though I am reading the column under the VT section, the article deserves full credit for acceptance in a scientific journal.
The article truly reflects the writer's computer knowledge, interest, and passion for using the device in everyday life. The metaphors you used to explain the complex material, and the cozy descriptions for ordinary readers, are absolutely cool. I was able to follow all the names mentioned here because I became familiar with them while attending a graduate-level introductory computer course.
Thank you so much for the presentation.

Hi, Licia -- thanks for your additions! Unfortunately, space constraints kept me from making a really exhaustive list. I did consider BNF, but thought it might take some explanation. :-) I did not know the origin of Kerberos, thanks!

A few others I left out were:

* RSA encryption, named for its inventors (Rivest/Shamir/Adleman) and used widely in financial transactions.

* The Django web framework, named for the famous jazz guitarist Django Reinhardt.

* Debian (a GNU/Linux distribution), whose name was derived from the combination of Ian Murdock (its founder) and his then-girlfriend Debra Lynn.

* And Bluetooth, which some folks like to claim is named for the Danish king Harald Bluetooth. :-)

I was interested that a couple of names have NOT been used, surprisingly -- as far as I know, nothing's been named for Charles Babbage, and it's a little surprising that there's no programming language named for Grace Hopper, who contributed so much to the early development of compiled languages. (The group at Microsoft that promotes women in computer science is named "Hoppers," I'm pleased to say.)