The ambiguity of information

We are very confused about the meaning of the word "information." And that’s for two good reasons.

First, it’s a really important word, and important words are almost always stretched to the ripping point as they struggle to cover topic after topic after topic. (Perhaps this indicates that topics are more different, and less susceptible to uniform explanations, than we think.)

Second, the word "information" became important because a particular genius—Claude Shannon—took it out of everyday parlance and used it in a very different way in his theory. It’s as if Einstein had used the word "deliciousness" instead of "relativity," so now when we talk about Swiss Chocolate Almond ice cream we’re not sure if ...

OK, skip the analogy. The point is that Shannon gave "information" a mathematical, probabilistic sense that had little to do with what we’d meant by the term before that. His theory was so powerful that it got applied to everything from DNA to black holes, and thus the term "information" got spread around and intermingled with the ordinary sense, along with the communication theory sense and the computer science sense, until it became a hodge-podge word. It has some precise meanings within particular limited fields, but if you try to define the term as it’s used in the phrase "The Information Age," I bet you can’t in a way that covers everything we mean by it.
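To make Shannon's probabilistic sense concrete: in his theory, the information in a message depends only on how improbable it is, and a source's average information per symbol is its entropy, measured in bits. A minimal sketch in Python (the function name and the example probabilities are mine, not Shannon's notation):

```python
import math

def shannon_entropy(probs):
    """Average information per symbol, in bits, for a source whose
    symbols occur with the given probabilities (which should sum to 1)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin tells you little you didn't already expect.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

Note what is absent: nothing about meaning, truth, or what the message is "about." That is exactly the gap between Shannon's term and the everyday one.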

Throughout the Information Age, however, the term has also retained its original meaning. That meaning is hard to pin down, too, but only in the usual way that words escape their definitions. Its normal sense hasn’t changed much in the past 150 years, and the memoirs of Charles Babbage are a good source for it. Babbage was the English inventor and mathematician credited with designing gear-based computers, starting in the 1820s, that anticipated the modern computer with eerie precision. (That is a very bad misreading of Babbage, in my opinion, but that’s a different hobby horse to ride.)

In his memoirs, written in the 1860s, Babbage uses the word "information" 28 times. In most of those instances, he means something quite ordinary, as when he says he asked some young classmates how to invoke the devil, and they gave him that information. We can get all twisted up trying to figure out what the defining characteristics of the class of statements called "information" are, but it’s really much simpler than that. In most instances, Babbage means by "information" simply something about the world that he did not know before. And that remains one of our usual senses of the term as well.

But a second meaning shows up in Babbage’s memoirs. For example, to help the British railway system decide what the distance between the rails should be, Babbage set up a metering system to measure the sway of railway coaches. The data that his instruments produced he casually refers to as "information." And that does indeed refer to a special class of knowledge: what fits into a table.

The importance of table-based information cannot be over-emphasized. Before computers, tables of numbers were crucial to applying mathematics. If you wanted to know the angle at which to aim your artillery, you had to look it up in a table. Galileo himself created and sold artillery tables. In fact, tables could themselves be an instrument of computation. For example, in 1684, Edmond Halley noticed a pattern of recurrence in the appearance of a comet. Yet it didn’t come back as regularly as it should have. Halley thought that perhaps this was because of the subtle gravitational forces of the planets. But he couldn’t figure out how to calculate that, so he went to Newton himself. The "three body problem" was too hard, Newton said, demurring. It took three French aristocrats spending an entire summer filling in a table, manually calculating the effect of the planets’ gravitational forces on the comet at step-by-step intervals, to confirm that Halley’s comet was indeed a single heavenly object.

But there was a problem with tables. Because they were created by humans, they were error-prone. Publishers would issue errata sheets, and then errata sheets for the errata sheets. In fact, the French tables that predicted Halley’s comet’s return only worked because—it was discovered later—the copious errors canceled one another out. Jonathan Swift declared mathematics to be a dim science precisely because we would never get the tables right.

But Swift didn’t count on Adam Smith. When the French government, after the Revolution, ordered new logarithmic and trigonometric tables created to reflect the new metric system, Gaspard de Prony used Smith’s description of the division of labor to structure the process. De Prony broke down the task of computing tables into a few simple steps, most of which could be performed by workers who only had to know how to do basic math. In fact, many of the people he hired were former hairdressers to aristocrats who had lost their hair because they had lost their heads. De Prony manufactured tables the way factories manufactured pins, he said. Babbage explicitly characterizes his own mechanical computers the same way: They were intended to be factories for the complete and error-free manufacturing of tables. (David Alan Grier’s When Computers Were Human is a terrific history of tables.)
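The scheme de Prony organized, and that Babbage set out to mechanize, can be sketched. It is the method of differences: once a few seed values of a polynomial are computed by a skilled mathematician, the rest of the table unrolls by addition alone, the only operation the least-skilled computers needed. As an illustration (the polynomial n² + n + 41 is the one Babbage reportedly used to demonstrate his engine; the function name is mine):

```python
def tabulate(seed_values, count):
    """Extend a polynomial table to `count` entries by the method of
    differences. `seed_values` are the first few entries, enough that
    the highest-order difference is constant (degree + 1 values).
    Every new entry is produced by additions alone."""
    # Build the difference columns from the seed values.
    diffs = [list(seed_values)]
    while len(diffs[-1]) > 1:
        row = diffs[-1]
        diffs.append([b - a for a, b in zip(row, row[1:])])
    # Keep the trailing value of each column; roll the table
    # forward by cascading additions, lowest-order column last.
    state = [col[-1] for col in diffs]
    table = list(seed_values)
    for _ in range(count - len(seed_values)):
        for i in range(len(state) - 2, -1, -1):
            state[i] += state[i + 1]
        table.append(state[0])
    return table

# f(n) = n^2 + n + 41; three seed values suffice for degree 2.
print(tabulate([41, 43, 47], 7))  # [41, 43, 47, 53, 61, 71, 83]
```

The division of labor is right there in the code: computing the seed values and differences takes mathematical skill; the loop that extends the table takes none.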

Table-based information is familiar to us. Tables pare a topic down to a simple set of repeating parameters. They are designed for fast retrieval. They are unambiguous. They are intensely useful. They are basically what we find in computerized databases. But if information stayed that simple, we wouldn’t be calling this the Information Age. For that, we had to take information as standing for something far more important. In the Age of Information, information becomes the very stuff of consciousness and even (according to some) of physics. Tables are too humble to carry such a burden.

So, we know what information was. We know what it still is, within circumscribed areas. But this is the Information Age, not the Age of Electronic Tables. Information means something much more than what it meant to Babbage. The problem is that we have an entire age named after it, and we still can’t say what it is. We have reconstructed our understanding of our world and ourselves using a term we don’t understand and don’t agree about.