I am the lucky fellow who got to have dinner with James Bridle last night. I am a big fan of his brilliance and humor. And of James himself, of course.

I ran into him at the NEXT conference I was at in Berlin. His, in fact, was the only session I managed to get to. (My schedule got very busy all of a sudden.) And his talk was, well, brilliant. And funny. Two points stick out in particular. First, he talked about “code/spaces,” a notion from a book by Martin Dodge and Rob Kitchin. A code/space is an architectural space that shapes itself around the information processing that happens within it. For example, an airport terminal is designed around the computing processes that happen within it; the physical space doesn’t work without the information processes. James is in general fascinated by the Cartesian pineal glands where the physical and the digital meet. (I am too, but I haven’t pursued it with James’ vigor or anything close to his literary-aesthetic sense.)

Second, James compared software development to fan fiction: People generally base their new ideas on twists on existing applications. Then he urged us to take it to the next level by thinking about software in terms of slash fiction: bringing together two different applications so that they can have hot monkey love, or at least form an innovative hybrid.

Then, at dinner, James told me about one of his earliest projects: a matchbox computer that learns to play “noughts and crosses” (i.e., tic-tac-toe). He explains this in a talk dedicated to upping the game when we use the word “awesome.” I promise you: This is an awesome talk. It’s all written out and well illustrated. Trust me. Awesome.
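For the curious, here’s a minimal sketch of how a matchbox learner of that kind might work. This is my own toy reconstruction of the general idea (weighted beads in boxes, reinforced after each game), not James’s actual design; all the names and the reward values are my assumptions.

```python
import random

# Toy MENACE-style matchbox learner (hypothetical sketch, not Bridle's build).
# Each "matchbox" is a board state; the "beads" in it weight the legal moves.

WINS = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    for a, b, c in WINS:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def legal_moves(board):
    return [i for i, s in enumerate(board) if s == ' ']

class Menace:
    def __init__(self):
        self.boxes = {}  # board string -> {move index: bead count}

    def choose(self, board):
        key = ''.join(board)
        if key not in self.boxes:
            self.boxes[key] = {m: 3 for m in legal_moves(board)}  # seed beads
        beads = self.boxes[key]
        moves = list(beads)
        move = random.choices(moves, weights=[beads[m] for m in moves])[0]
        return key, move

    def learn(self, history, reward):
        # Reward scheme (an assumption): +3 win, +1 draw, -1 loss.
        # Bead counts never drop below 1, so every move stays possible.
        for key, move in history:
            self.boxes[key][move] = max(1, self.boxes[key][move] + reward)

def play(menace):
    board, history, player = [' '] * 9, [], 'X'  # the learner plays X
    while legal_moves(board) and not winner(board):
        if player == 'X':
            key, move = menace.choose(board)
            history.append((key, move))
        else:
            move = random.choice(legal_moves(board))  # random opponent
        board[move] = player
        player = 'O' if player == 'X' else 'X'
    w = winner(board)
    menace.learn(history, 3 if w == 'X' else (1 if w is None else -1))
    return w

random.seed(0)
m = Menace()
results = [play(m) for _ in range(2000)]
late = results[-500:]
print('late win+draw rate:', sum(r in ('X', None) for r in late) / len(late))
```

The charm of the original is that this whole loop was done with physical matchboxes and colored beads, no electronics at all, which is part of what makes the “is this information processing?” question so pleasantly slippery.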

I’m working on a talk for Reboot, a very fun conference in Copenhagen. Because it’s an after-dinner talk, and because it’s a bunch o’ geeks, I plan on talking in a hugely preliminary way about some of what I’ve been researching for the past few months. I’m assuming the audience’s preemptive forgiveness. Also, with luck, they’ll all be a little drunk. At the moment, my talk is called “Babbage’s Noise,” mainly because I like the way it sounds.

I’m still trying to pick a thread through the morass of material I’ve happily sunk into. The outline I’m currently sewing together — unsuccessfully, so I reserve all rights to ditch everything and talk about Cluetrain or how everything is mixed-up, smooshy, and miscellaneous if I have to — begins by talking about Charles Babbage’s intense irritation at the hurdy-gurdy players outside his window. Babbage, of course, is routinely credited with having invented, in the 1820s, a precursor to the modern computer, one that many say got just about all the elements of the architecture right. Fascinating guy. I then want to compare his use of the term “information” with the modern formulation, which comes from Claude Shannon, but which was quickly transmogrified.

Ultimately, I want to argue that Babbage’s machines had nothing essential to do with information in the sense in which we use the term in the modern age. Babbage thought he was applying Adam Smith’s principle of the division of labor to the production of tables. My talk will spend some time on the history of tables, because I think it’s really interesting. But the main argument against reading the modern idea of information back into history is that modern information is encoded and symbolic, neither of which were true for Babbage’s machines, although I grant that it sure looks that way.

And I think I have an overly clever way of bringing it back to the modern sense of noise. (Possible spoiler alert, depending on where the talk goes: Communication theory generalizes from the exceptional case in which communication is derailed by noise.)

Here’s a slick video of the new, second Difference Engine, built according to Charles Babbage’s 1848 plans. The narrative is overheated, but the visuals are nice.

I’ve been continuing to read about Babbage because I think he provides an interesting way to argue that information didn’t exist before the middle of the 20th century. It’s a mistake to view even Babbage’s more advanced machine (the Analytical Engine) as dealing with information, much less as a computer. But I’m not ready to make that argument yet. I’m having a lot of fun researching it, though.

For the past few months, I’ve been thinking about writing something that argues that the history of information is way more discontinuous than we’ve thought. Usually, we trace info and computers from Turing and Claude Shannon back through Hollerith’s punch cards, back to Babbage, and maybe back to the Jacquard loom, which used punch cards to control the patterns being woven. But I think this reads the modern idea of information back into machines that were not information-based at all. The loom cards look like punch cards, but they’re not really information, any more than a gear is. Or a comb is, for that matter.

When we discovered atomic theory, we were able to claim that historic objects were made of atoms all along. But I don’t think it’s the same with information theory. Reading info back into historical objects feels more like what happened when the universe started to look like clockwork.

This matters to me because I think we’re beginning to emerge from the Information Age. The paradigm is just starting to break. So it’s a good time to wonder how we ever managed to conceive of ourselves and our world as made out of information. How did we become information?

So, I’m not sure how to approach this, but I’ve been having a lot of fun reading about Babbage (including the new Difference Engine construction, as well as Doron Swade’s account of the first one), Hollerith, Turing, Shannon, and the rest of the cast of characters. I’ve also been poking around in some disciplines that reconceived themselves as being about information, especially genetics. Some great stuff has been written about this. (E.g., “Who Wrote the Book of Life?: A History of the Genetic Code” by Lily Kay.) Every conversation leads to another three books, and every book leads to another ten, so I’ve been reading fairly randomly and quite happily at this point. (No, I have not yet read “How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics,” by N. Katherine Hayles, but it sounds spot on.)