1995. AOL, CompuServe, and Prodigy were the on-ramps to the Internet for most people. Geeks and university students surfed around via Telnet, FTP, Gopher and the still-new graphical web browser from the NCSA called Mosaic. Yahoo, Amazon, and Craigslist had just been born; Google was a year away. The Kindle was hard to even imagine.

I toiled in grad school, studying the cultural consequences of the spread of printing technology in England. But I was a computer nerd too and wholly smitten with hypertext: first the literary, non-networked variety you installed from floppies, then the new blue-underlined links on the World Wide Web. So I did what an unmarried, basically friendless grad student does with a topic of interest: I spent weeks working on a horribly convoluted, jargon-heavy paper about it.

portrait of the author as a young nerd

The Heresy of Hypertext: Fear and Anxiety in the Late Age of Print was my embrace of electronic text and the nascent web. But that embrace was not really a discussion of the merits of hypertext. It was a critique of the critiques of the new medium — so very closed-loop and academia. In hindsight, a thoughtful analysis of what the Internet revolution might mean would have been way better, but then maybe I was presaging the web phenomenon of trolling the haters.

It’s painful to read, sodden with language from critical theory and words I made up out of whole cloth — “technoclasm”, “intraloquial”, what? — but I think I have managed to pull out the thread of argument. Let’s dissect and evaluate.

My starting position was that the printed word was on its way out. No timeframe, just that the slide was in progress. Probably seemed more radical in 1995 than I knew.

Despite exaggerated reports of its demise, the codex book is not dead — but, like handwriting in the age of print, it isn’t likely to remain the dominant means of textual dissemination. So the question is not if computers will transform our notion of reading and writing, but instead how?

True, in 2015 lots of books still get printed and read and, sure enough, newspapers are still around, but I think it is hard to argue that I got this one wrong. Though it is tough to quantify just how much reading now takes place on screens, given the rise of higher-resolution displays, smartphones, and tablets, it seems obvious that the printed word has been toppled from its spot as Most Favored Medium. The questions of what is being read and of the quality of the print vs. online reading experience are other matters altogether, but I think this point has been proven over the last two decades.

So, on to the various anxieties I sought to shed light on:

Anxiety #1: Unmediated, unsystematized, instant access to information will make people dumb.

[Critics fear] that the non-hierarchical interconnectedness of hypertext represents little more than textual totalitarianism, implicitly proscribing what can and cannot be read by the existence of a pre-defined nexus of links.

It is certainly true that total access to information is not the same thing as knowledge, though that has always been the case. Instant access to basically any information has been shown to promote confirmation bias and to enshroud us in filter bubbles — and probably contributes to the aversion to long online reads.

The critics were right to be skeptical. Hypertext-linked information does not make us dumb, but it does seem to amplify some of our human tendencies toward intellectual laziness. I’d argue this does not outweigh the benefits of access to the breadth of information available on the web, but my breathless cheerleading for “radically egalitarian” electronic text seems a bit shortsighted so many years on.

Thankfully, this part holds up:

… the circulation dynamic of texts published on the Internet resembles the medieval and Renaissance practice of glossing, parodying, or otherwise altering a manuscript before passing it along. Slowly, though, the ubiquity and fixity of print have eradicated such practices, all but banishing the notion of a collaborative, “textually permeable” work. Now, the cult of the author and the printing press are inextricably linked; you can’t have one without the other. Digital text, however, requires neither. As a consequence, and much to the chagrin of political critics, no economic model has yet been devised to explain its production and propagation in a capitalist society.

I wasn’t really thinking about journalism at the time, but it is true that we have yet to see a really viable business model for words on the web. The printing press gave birth to more than our modern concepts of authorship and copyright; it created a giant business model that we still haven’t properly modified or turned our backs on.

Anxiety #2: The word on the screen is not the same as the word on the page.

… both Luddite and hacker agrees [sic — er, sic?] that a difference does exist. But if the same word inscribed on paper and displayed on a computer screen means the same thing — and how could it not? — then the only explanation is that we perceive a discrepancy, that the medium itself somehow affects how we think of the words.

Notwithstanding that the estate of Marshall McLuhan is probably owed royalties on that particular insight, it is obvious to nearly everyone now (as it was then) that there is an experiential difference between screen and paper. Reading on the Kindle most certainly does feel different than reading a book. Research has proven this to be the case with some concrete differences between paper and screen.

If I had an argument back then it was that this perceptual difference was not enough to dismiss electronic text as a medium altogether, which many critics did. I didn’t really back this up with anything, which may explain why I am not a college professor in 2015.

Anxiety #3: Electronic text, composed in graphical environments and often presented on screen intermixed with visual media, demeans the power of the word.

The upshot of critiques in this category is that words should be words, pictures should be pictures, and keep ’em separate. I cited hand-wringing over GUI-based word processing environments where “students who used word processors in the iconic environment of the Apple Macintosh wrote qualitatively and statistically inferior prose compared to students who composed on the text-based interface of IBM’s DOS machines”. That may or may not have been true then, but what’s interesting 20 years later is that, though we live firmly in graphical computer environments today, the pendulum for serious word processing has swung back, essentially acknowledging that visual distraction and excessive document customizability are at odds with serious writing. Programs like Scrivener and Ommwriter and window-management features like OS X’s full-screen mode exist to address the problem of focus.

I did chuckle upon re-reading this passage:

[The] fear is not that future writers will revert to pictograms but rather that the traditional modes of textual composition that stress linearity, closure, and containment are being eroded from the inside out by the visually-based compositional aids themselves.

Turns out, the Internet in 2015 loves pictograms. I was being flippant back then, but with the ascendancy of emoji and the rebirth of animated GIFs it is hard to deny that word and picture are equal units of syntax in online writing.😐

The essay really says more about me than it does the Internet. If you can parse the byzantine prose you hear a kid full of excitement about a new medium, ready to make the jump from English grad school to Information Design. It was a thrilling time, though I’m left with nostalgia for a time when the web seemed so uncharted and full of possibility. It’s still a wilderness, but not for the right reasons.

The piece ends with this vignette:

In December of last year [1994], panic struck the Internet when rumor spread that a text-based computer virus was replicating its way across the globe and that it could be acquired simply by reading one’s electronic mail. For a while millions of people refused to approach their messages, afraid that the very act of reading would cause infection. We would do well to learn from this reaction, especially since the idea of a text-based virus was proven to be a practical impossibility; the rumor was a hoax. Indeed the rumor that the new medium of digital textuality will infect or corrupt our print-based practices is as baseless as the hysteria caused by the alleged virus. Human communication, like a living creature, has always adapted to tumultuous periods of change, surviving the “pathogenic” influence of speech, literacy, and moveable type, and no reason exists to think that it won’t adapt to the computer.

Viral infection from online reading was a joke in 1995 and for years afterward (though many believed it was possible), but recently a bug was found in the Android operating system that allowed malicious code to take over a device simply by receiving a text message (technically an MMS).

There’s still no need for panic; we adapt, as we always have. And that’s the argument of “The Heresy of Hypertext”, minus 3,806 words.

Richard Powers wrote a lengthy non-fiction piece for GQ* late last year that I’ve only just stumbled across.

If you’ve read The Gold Bug Variations this might not surprise you, but the column is about his decision to let the magazine fund the sequencing of his genome — with all the potential bad news that might bring. It’s a tale of wonder and high anxiety.

As I disembark and stroll down the mobbed concourse at O’Hare with my genome in my flight bag, I get a flash of how genes in endless combination, shaped by nothing but natural selection, have propelled life from bacterial automata to big brains, from flint shards and pointed sticks to genomics. The novelty gene, the curiosity gene, the dissatisfaction gene, the problem-solving gene, the constantly recombining genes for restless leg, restless stomach, and restless mind have pushed right to the verge of recasting themselves. For a very long time, we have been moving from scripted characters to the co-authors of our own lives. The personal genome is one more tentative step from fate to agency, from fatalism to risk management. We are determined not to be determined. The code is loose and always has been. For good or ill, there’s never been a bottle that can hold this genie.

Read: The Book of Me

Update: Of course I say non-fiction but I have been burned by Powers before on that. Sometimes he just defies belief. Which is why he is my favorite author, bar none.

* GQ, wow, what an amazing understanding you have of your audience. Splitting the article into 21 separate screens! It’s like you don’t want your readers to get to the end, much less click on your damn ads. Well done. Enjoy 2003.

FTB is a collection of short reviews that explores the connection between the subject of a book and the actual location it was read. Like Rob Gordon’s autobiographical organization of his record collection in High Fidelity, the idea is that reading is not an act sliced off from the context in which it happens. The real world has a way of bleeding into the written world, and vice versa. FTB is a compendium of crossovers.

This year Coudal is releasing the reviews as a bound book. Looks gorgeous, as does the poster and the process behind it. Kudos to Steve and the entire Coudal crew for editing such a cool volume. And for inviting me to contribute.

For map-types, I’ve put together a quick one showing the geographical dispersal of the reader reviews. Some are guesses — for point-to-point air and road travel I used the midpoint as the location — and others don’t exist at all. Happy trekking!

I’ve been a cataloger of nouns used as verbs since Jeff Spicoli opened my mind to the possibilities with “hey bud, let’s party” in 1982. There’s “google” and “calendar” and “lunch,” but the original utilitarian nounverb has got to be “smurf”.

The children over here are all about Apple TV and they recently found the Smurfs cartoon. The show was way before their time and slightly after mine (I’m a product of The Superfriends and Scooby-Doo, thank you very much) but it captivated them. So last night, post-concert, home alone, I watched a few episodes. Like a chill-out room at a rave.

Here is a lexicographical analysis of a single episode, called “The Smurf’s Apprentice,” for the varied uses of the word “smurf”. You can, in fact, get the gist of the whole show just from these lines.

“Watch where you’re Smurfing!”
“Smurfic acid”
“A half-Smurf of burnt sienna”
“Three Smurfs of sarsaparilla leaves”
“Now to see if the experiment Smurfs”
“I’ve finally Smurfed spontaneous germination”
“I have a real gift for Smurfin’ magic”
“I want to Smurf magic right now”
“Now’s the time to Smurf a look at his magic book”
“What are you Smurfing in here?”
“A Smurf of quicklime”
“Smurf over a low flame”
“A monster! Smurf for your lives!”
“I Smurfed a magic potion”
“We’ll have to Smurf something else”
“You mean I’ll be this way for the rest of my whole Smurf?”
“I’ll find an anti-Smurf and I’ll call you”
“Smurf it, Grouchy!”
“Smurf’s-eye!”
“I need volunteers to Smurf me three hairs from a cat’s tail”
“We’ll all Smurf you a hand”
“The rest of us will start making the anti-Smurf”

Gerunds, exclamations, even an adjective in there. Does English have a word this useful, a Swiss Army word? I suppose certain expletives might work. Go ahead, swap out “Smurf” for “fuck.” Doesn’t quite work, but will make you smile.

I’ve been working with the Forbidden City in Beijing for the last three years. It’s been rewarding in many ways (and hopefully will reward you come June), but maybe my favorite thing is how things are named there. I’m not talking about poor English signage, though that always gets a chuckle. I’m referring to the historical practice of naming buildings in a style that is both humorously literal and exaggeratedly fantastic.

Consider the following.

Pavilion of the Three Friends – OK, but which three? You’re immediately interested in knowing what this place is about, aren’t you? The best kind of place name for a museum of similar-looking buildings.

Hall of Mental Cultivation – A center of learning, right? You got mentally cultivated just reading the name of the place.

Palace of Tranquil Longevity – Definitely a better name than Del Webb’s Sun City. When I get old and crotchety, please put me in a palace of tranquil longevity.

But those are just for beginners. Ratcheting it up a notch we have …

Palace for the Establishment of Happiness – In reality, part of Qianlong’s palace-within-a-palace retirement complex. In your mind, so many naughty things.

Last night, as my wife was pulling into the garage with my kids, a friend who had come home with them looked at my workbench tools and said:

“I didn’t know your dad made stuff.”

Nathan said, “Yeah, he does.”

“Does he make toys?”

“No, my dad makes music. Like mixtures. For his friends.”

Well said! Keep that up and you’ll inherit everything.

On the other end of the spectrum comes this trenchant insight from a good ol’ boy down on the Texas coast. I overheard him explain:

“Y’see, hurricanes are kinda like NASCAR wrecks. If you see a car crash ahead of you you gotta head towards it, ’cause it ain’t gonna be there when you get to it. Same with hurricanes. I always drive straight towards ’em.”

Now, I’m no meteorologist (much less a racecar driver), but something about this analogy fails to convince. Though it does have a pleasingly Darwin Awards flavor to it. Yes, Bubba, you drive straight for that storm.

And lastly, Larry the fishing guide (whom you may recall) joined us last week for our annual menfolk fishing expedition. As we were casting the little “piggy” perch on our lines he explained that as soon as they hit the water you had to jerk real fast. This seemed odd until he elaborated:

“You gotta piss off the bait. Make it mad, so it does what you want it to do.”

Yes, you must bitch-slap your bait so that it makes croaking noises that attract other fish. I thought this was rather brilliant, but it still seemed odd. I mean, wouldn’t you think that hurling the bait through the air 50 feet before it smacks down onto the water would sufficiently piss it off? Still, a lovely quote, especially if you say it with an immense, syrupy drawl.

LEGO is making products so amazing they have to censor how much their customers rave about them.

From a site-based e-mail to my wife that began “Holy shit!” and ended with some half-hearted rationale that we needed to buy this for our son for Christmas. (At least I’m self-consciously transparent, you know?)

No story exists as an island these days. Books beget movies and vice versa. Sequels, prequels, and tangentially-related storylines are published and consumed. Graphic novels, anime, television series, and videogames flesh out the rest of the universe.

This is all market-driven, which is why it isn’t new. But the web and the low barriers to user-created content have sent a small exploded moon’s worth of fictional ephemera* into orbit around popular stories. Alternate realities, fan fiction, 3D worlds, even amateur video series fill in any remaining gaps. Narrative today abhors a vacuum.

This is exciting, though it has been mostly theoretical for me. I mean, I know it is out there, but I rarely encounter it. As with so many things, it takes the perspective of a child to really make clear how powerful an idea can be. My six-year-old son is a huge Star Wars junkie. He can’t get enough. He’s seen all six movies and both Clone Wars animated series, has dozens of books, has thoroughly mastered LEGO Star Wars I and II, and consumes any other info he comes across. Wookieepedia has changed his life.

Here’s the thing. My son knows that Star Wars isn’t real. He really does. But he also believes that it is a complete fictional universe. The movies? Oh, well, they’re good, but in his opinion they are just slivers of the stories in this galaxy (from A Long Time Ago) that someone happened to film. The movies don’t have any real precedence over detailed articles in Wookieepedia about, say, the massacre at The Battle of Rodia, the fallible Jedi Set Harth, or the renowned Sullustan journalist Den Dhur. No, I hadn’t heard of any of these either.

Clearly there are limits to this sense of completeness. My son will ask a question about a planetoid or something that none of the games, videos, or wikis can answer. But in his mind it isn’t that the fact or storyline doesn’t exist. It is that it has not been found yet. And isn’t this how we think in the age of The Google? That wanting to know something is more a matter of locating it than wondering whether it exists to be known?

The Star Wars universe is Borges’ Library of Babel and my son is lost in the stacks. Happily so.

Of course, you’ll argue, the best fiction deliberately leaves things out, opens a space for the imagination. One could no more know everything about a given fictional world than one could know everything about real life.

Well, I think of Lost. The world of that island is meticulously crafted; half of every show is backstory. But obviously there are massive gaps in the storyline. This annoys lots of viewers, but it is also what keeps people coming back and, of course, is precisely what enables the universe to expand, whether by ABC scriptwriters (an alternate reality game, “official” in-world websites) or by fans (an archive of over 3,500 fan-created videos, a dedicated wiki).

Soon I’m sure my son will arrive at where the sidewalk ends. Some Star Wars story path he’s on will hit a dead end. He’ll confront an incomplete world and will be required to suspend a new kind of disbelief. But if he’s anything like me this will also be the moment when he realizes that creating is even more fun than finding.

[*] Odin Soli has called these overlapping stories “fictional ecospheres.” I like that.

What makes this tribe special is that their language defies a major linguistic theory, championed by Noam Chomsky, that posits a “universal grammar” embedded in the human brain that explains the structural similarity between every language on Earth. In a nutshell, this theory claims that all languages are shaped by a unique biologically-based human ability to create recursive thought structures. What’s that? Well, basically, it is inserting one thought into another. Such as the combination of “A dog is barking” and “The dog has fleas” into “The dog which has fleas is barking.”

The Pirahã don’t do this. Indeed, it appears that they can’t. They simply do not think this way because, in essence, recursion is based on abstraction and the Pirahã do not deal with abstraction.
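For the programmers in the audience, here is a toy sketch of my own (nothing from Everett’s paper, and the function name is my invention): recursion rendered as a Python function that keeps inserting one thought into another as a nested relative clause.

```python
# A toy illustration (mine, not Everett's or Chomsky's): linguistic
# recursion modeled as a recursive function that nests one thought
# inside another via relative clauses.

def embed(subject, facts, verb):
    """Fold each fact into the sentence as a nested relative clause."""
    if not facts:
        return f"{subject} {verb}."
    head, *rest = facts
    # Insert one thought into the subject, then recurse on what's left.
    return embed(f"{subject} which {head}", rest, verb)

print(embed("The dog", ["has fleas"], "is barking"))
# The dog which has fleas is barking.
```

A Pirahã-style grammar, on this account, would be the base case only: each thought gets its own sentence, never nested inside another.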

… the Pirahã perceive reality solely according to what exists within the boundaries of their direct experience — which Everett defined as anything that they can see and hear, or that someone living has seen and heard. “When someone walks around a bend in the river, the Pirahã say that the person has not simply gone away but xibipío — ‘gone out of experience,’” Everett said. “They use the same phrase when a candle flame flickers. The light ‘goes in and out of experience.’”

This has set the linguistic world, much of which subscribes to the Chomskyan belief in an innate, biological basis for the structure of language, on its ear.

I’ve had a passing interest in linguistics since grad school and I bounce around the periphery of the ongoing debate, but what really interested me about this piece is how much it reminded me of a comment my son made last year, which I wrote about:

Recently as we passed some strangers on the street he asked “What happens to people when you don’t see them anymore?” He was hovering around asking whether they ceased to exist, though he never actually said so. We explained that they kept on living their own lives and that we’d probably never see them again. This saddened him a bit, though only slightly less than it puzzled him. I think he’s only just realizing that the sum of human experience is a superset of his own.

That got me thinking about the “universal grammar” concept. Maybe abstraction is not biologically-based but learned. But it went further:

… he’s even more obsessed with names. He simply cannot understand how there can be things that do not have names. He constantly asks about how something can exist if it doesn’t have a name. I explain that there are thousands (millions?) of species of animals, mostly small critters, that we suspect exist but have not been discovered and so have not named. Not to mention undiscovered stars, comets, planets and new concepts, future fashion trends, and dance moves…. Like Adam naming stuff in Eden, the power to name is the power to make real for my boy.

So, there it is. One researcher in the Amazon jungle and one little boy in Chicago, both defying the reigning theory on the origin of language. Perhaps my son has Pirahã blood in his veins (though his genography says no).

Mr. Romer’s answer is to do with this moment what Burning Man does every summer: Stake out the street grid; separate public from private space; and leave room for what’s to come. Then let the free market take over."

Cross-posted at Medium. Years ago, when I was just moving into the world of urban technology, I stumbled upon the urban “walkshop” format developed by Adam Greenfield and Nurri Kim (refined and expanded by Mayo Nissen). These walking tours were collaborative, lightly-structured investigations of surveillance and communications machinery sprinkled throughout various cities. Though these walkshops […]

Cross-posted at Medium. CityFi partners John Tolva and Story Bellows frequently find themselves on the frontier of urban change. Their facilitation of major projects often convenes public-private stakeholders, utilizes new models of technology and innovation, and drives policy development, all with the goal of making our cities more livable, vibrant and economically competitive. Currently, this […]