This Is Your Brain on Twitter

By Nick Bilton May 18, 2011 3:48 pm

5:47 p.m. | Updated: Adding response from Bill Keller at the end.

In his latest column for The New York Times Magazine, Bill Keller, The Times’s executive editor, likens clearing the way for his 13-year-old daughter to join Facebook to handing her “a pipe of crystal meth.”

I can’t say I have ever tried crystal meth, but I do visit social networks on a regular basis. Twitter, which Mr. Keller says he believes could make us “stupid,” has become an irreplaceable part of my daily life; it augments how I report stories, socialize with friends and share and consume everything from store coupons to breaking news.

Before I embraced the social flow of information on the Web, the bulk of my news came from the printed newspapers and magazines that arrived on my doorstep. Now this news is sliced and diced by my social circle, with the most important pieces of content from around the Web being presented to me like a neatly wrapped present each time I check my feed.

Another concern of Mr. Keller’s is the prediction by many that the smartphones we carry in our pockets will soon evolve into wearable computers — essentially turning us into cyborgs.

This future could not arrive soon enough. The smartphone that we have all grown completely dependent upon has become one of the rudest technologies ever invented. It harasses us when we have a new e-mail, text message or social network update. Technologies that are wearable and more aware of their surroundings, and therefore able to tell when it is O.K. to interrupt us, will let us wander the halls of society without our gaze turned downward and two thumbs clacking away on a mini-keyboard.

There is a fear among many, Mr. Keller included, that these devices will wipe out our ability to remember and force us to become dependent on the virtual world. Luckily for us humans, our brains do not work this way. Research shows that the human brain is capable of adapting to new technologies in less than a week, regardless of age or intellect.

As I’ve written in the past, Maryanne Wolf, the director of the Center for Reading and Language Research at Tufts, points out that our brains were never even designed to read. This “technology” is something that we have to train our brains to do.

Just as we hacked our brains to read, we will adapt to tomorrow’s technologies without flushing away our powers of memory.

To return to the “pipe of crystal meth” analogy: If I had read Mr. Keller’s censure of Twitter and Company a year ago, I would have vehemently disagreed with almost all of his views. But today, after using these products voraciously for some time, I do see their detriments. When used to excess, these technologies that so easily connect us to people far away concurrently disconnect us from people who may be directly in front of us.

But I believe “excess” is the key word here. Used as a crutch for our relationships and memories, these tools and technologies could indeed spawn a flawed generation of people peering into their phones without regard for the world around them.

Now, has anyone seen my cyborg glasses?

Update: Here is a response to this post from Bill Keller.

Nick,

I admire your expertise and applaud you for having the fortitude to challenge your less digitally adept boss.

You’re fired.

No, just kidding. As always, you are worth reading on any subject related to the intersection of technology and our lives. To continue the conversation a bit:

First, I wish you had made clear (as the Twitter torrent mostly does not) that I praise Twitter and Facebook rather effusively. My column does not advocate that Twitter be censured, censored, abandoned or ignored, even if any of those things were feasible. Twitter and Facebook are ingenious devices, and they happen to be wonderful tools for disseminating (and, up to a point, helping to create) great journalism, about which I care mightily. All of that I said emphatically in the column.

Second, my point is not that Twitter makes us stupid. That was a hashtag I tweeted, a premise followed by an invitation: “discuss.” I sent it to demonstrate that for all the things Twitter is, it’s not a very good venue for intelligent discussion. And if the response to my “Twitter makes you stupid” provocation didn’t convince you that Twitter is ill-suited to real discussion, the tweeters’ response to my column should.

I am, of course, delighted that so many people have sent it on, and humbly surprised at how many actually agreed with it. And I always feel a spurt of delight when somebody uses this haiku-sized format to produce something clever, whether it’s in my favor or not. But I don’t think anyone would hold up this stream of tweets as a proud example of an enlightening colloquy. In fact, many of the reactors seem either not to have read what I wrote, or to have read it with threadbare attention. A fair number were more concerned about the fact that my column didn’t link to a couple of tweets (which I had quoted in their entirety) than with the substance.

The scary part is how fierce and faith-based some of the reaction has been. I mean, lighten up. This isn’t your religion we’re talking about. Or is it?

What I said in the column is that we pay a price for progress, and we should pay it wittingly rather than have it siphoned secretly from our bank account. And we should consider whether the price is worth it. If Facebook is displacing real friendship, if Twitter is diminishing actual conversation, then maybe that’s a good reason to limit how much of your life they consume.

As for replacing my phone with something that rings inside my cerebral cortex — well, you’ll make a better cyborg than I will.