Tuesday, December 15, 2009

It’s hard for me to believe that anyone — anyone — would think it a good idea to project a giant stream of Twitter commentary on a speech while the speaker is giving it — but that’s what they do at the big Web 2.0 conference, with predictably disastrous results for Danah Boyd. Note the comments by Kathy Sierra, who has been on the receiving end of some nasty commentary herself. And see some further reflections on this ludicrous practice here and here and here.

Let me make an observation that these other observers are, it seems, reluctant to make. That actual multitasking is cognitively impossible has been established beyond reasonable doubt: see Christine Rosen’s article in this very journal, or, if you prefer to look beyond our house organ, try here and here. In fact, it has become clear that the people who think they are skilled multitaskers actually are worse at it than other people.

So when you set up a Twitter stream to project as a speaker is speaking, and invite people to participate in it, you are simply asking them to fail, miserably, to understand what the speaker is saying. If a speaker makes a point that you find dubious, are you going to wait to see if later stages in the argument clarify that point, or perhaps make it more plausible? You are not. You are going to tweet your immediate reaction and therefore simply miss the next stage in the speaker’s argument. Every tweet you write, and every tweet you read on the big screen, compromises still further your comprehension of the lecture. I bet that after the talk was over there weren’t a dozen people in that audience who could have given even a minimally competent summary of what Boyd said.

Boyd understands all this: “Had I known about the Twitter stream, I would've given a more pop-y talk that would've bored anyone who has heard me speak before and provided maybe 3-4 nuggets of information for folks to chew on. It would've been funny and quotable but it wouldn't have been content-wise memorable.”

That is, she would have given a talk that did not make a sequential argument but just strung together sound-bites, because the audience couldn't have grasped anything other than disconnected aphoristic statements. In other words, she would have given a talk made of tweets, because that’s all that her tweeting audience could possibly have received. And even then they would have gotten only some of her verbal tweetery.

(Incidentally, or maybe not incidentally, there are certain ironies involved in Boyd being the one to complain about this situation.)

So what the people at Web 2.0 are saying to their speakers, loudly and clearly, is this: We don't want sequential reasoning. We don't want ideas that build on other ideas. We don't want arguments. Just stand up there and fire off a series of unsubstantiated claims that have no connection to one another. Preferably 140 characters at a time.

25 comments:

I experienced this at the An Event Apart Web conference and saw both negative (20%) and positive (80%) implications. Maybe Apart attendees are kinder; the majority of the tweets were restatements of the speakers' points, or quotes. I will admit (90%) that it was distracting and that it would have been better to ignore the feed and concentrate on the talk. That said, a couple of the talks (10%) were so boring that the feed was actually more interesting, though probably not helpful.

A good reason to hate this kind of backchanneling is the same reason we hate people who talk during movies (at home or in the theatre), but the distinctive evil of Twitter feeds is their silent, semi-anonymous, unaccountable nature.

I don't grok Twitter in general, and am fond of this quote: “What can be said in 140 characters is either trivial or abridged; in the first case it would be better not to say it at all, and in the second case it would be better to give it the space it deserves.”

There are a few useful applications I can imagine for Twitter -- protesters, street vendors -- but they account for a tiny proportion of its total use.

While I've never live-tweeted during an event, nor do I plan to, I disagree with the premise here. It seems you're saying one cannot pay attention to a speaker and take notes (or post tweets) at the same time. I went through college and grad school taking copious notes on my computer, and don't think my learning or interaction with the material was compromised much (if at all). In fact, even those who still take notes with pencil and paper (or papyrus, for that matter) are required to disengage temporarily while they write. Are they incapable of summarizing a talk or argument, too? I think not.

Anon, it's not so much what I'm saying, but what an ever-increasing body of research is saying. And it fits with experience: I can't tell you how many times over the years I've had students ask me to repeat a point that they missed because they were taking a note on my previous point. Some of these trade-offs are inevitable — people often need to have notes to refer to later, so I don't mind repeating points in that situation — but since no one is actually capable of multi-tasking, every "temporary disengagement" has a price. What live-tweeting at conferences does is to dramatically increase the occasions of disengagement: if in between composing your own tweets you're also reading other people's tweets on the screen behind the speaker, and probably the tweets that are coming to your screen from the people you follow, how much attention do you have left over for the speaker?

What makes all this interesting — it now occurs to me, thanks to Anon's comment — is that there's a kind of sweet spot where close attention and reflection come into balance, and it's hard to find. I don't presume to judge his note-taking habits, but most students who take notes on their computer are intellectually checked out of the class session — they're just taking dictation, basically — and have a tough time using those notes productively later on. (There's a body of research on this too.) I like the old Cornell note-taking system because it seems to aim pretty well for that sweet spot.

But let's not hastily blame Twitter itself because of this admittedly egregious use of it. The challenge of a new medium is not to dismiss it--that's too easy--but to figure out how to use it appropriately and well. This was true of the printing press, and it is true of the internet.

Literary history is rich with forms that would easily fit within the 140-character limit. Haiku. Epigrams. Aphorisms. (Francis Bacon, whose work inspired the title of this journal, particularly liked the aphorism.)

We shouldn't blame the epigrammatist for not writing an epic. We might blame her if she attempts to write an entire epic in epigram form, and if she believes that she has succeeded. But the problem in those cases lies with the author, not the medium.

As far as multitasking goes, it's true that no human being can really do two things at once, but most human beings get very good at switching between multiple tasks.

Any professor who offers substantial feedback on a student paper is alternating between different reading, thinking, and writing skills in rapid succession in the space of a single page--and few, if any, of those comments in the margin exceed 140 characters.

Chad, has anyone here criticized Twitter itself? (I'm over 3,000 tweets myself, so I don't think I'm in a good position to do so.)

Also, I'm not sure how the example of grading papers fits here — my post was about the difficulty of writing and listening at the same time. When I pause to make a comment in the margin of a student paper, or a book or article, the text doesn't continue talking as I write.

I've found projected livetweets useful during discussion sessions, if the session is well-managed. Trying it during a presentation seems a recipe for disaster, though.

Also, a lot of the tweets in my stream, and of those I follow, are short summaries or comments on a longer piece, with a shortened-URL link to the piece. Works fairly well as a current-awareness service.

While I can understand why a conference focused on interactivity would consider including a Tweet wall, I'm surprised that they didn't rule it out. It's too bad that organizers -- the people who are supposed to think things out in a premeditated sort of way -- wouldn't conclude that a Tweet wall provides an entirely unnecessary layer of interactivity. Questions at the end of a conference usually suffice, and a video or audio recording of a presentation is far more useful than a bunch of Tweets.

The real punchline, though, is that for all its novelty, Twitter is comparatively low-fidelity next to something as simple as a spoken and recorded question. A whiteboard would actually be a step up.

Another quick thought: why introduce a commentary function that has essentially no accountability and no way to limit the number or redundancy of comments, arguably the worst aspects of Web dialogue? It's entirely unfair to the speaker, who is physically in the same room with people who can discuss her anonymously, something that would otherwise be impossible.

I think much of this could be solved by emulating the traditional Q&A format: prepend the full name and photograph of each person posting to the feed, moderate it, and give the tweeters an opportunity to expand on their comments after the talk.

The small number of people I follow on Twitter use a couple of tactics to counter this. Either they provide a headline and a shortened link to a long article, or they hang back for a bit and then post several sentences in rapid succession to build a longer-form piece. (These are easy to follow if you click on their feed.)

(I've mostly used it to follow events in Iran, and most of the people I follow are either journalists (Iranian & Western) or sophisticated Iranian citizens or expats. Some get carried away and post so much I can't find anybody else's posts; when that happens I delete them and just click on them when one of the more restrained posters cites them.)

Surely note-taking can focus attention and structure information in a useful way, but that's an inner-directed, attentive activity. Tweeting is just the opposite: it is projecting oneself, performing, essentially. Indeed, I think most of the expression that constitutes web2.0 -- blogs, tweets, comments -- is best thought of as performance: phatic information that serves, primarily, a social function (i.e., "social media"). As Christine Rosen has pointed out in her article "People of the Screen," the reason a non-interactive novel cramps the style of the web2.0 mindset is "that you must first submit yourself to the process of reading it—which means accepting, at some level, the author’s authority to tell you the story." That kind of submission to authority is not compatible with performance, and neither is submitting to the authority of "the sage on the stage." Other than Rosen and Lee Siegel's "Against the Machine," I haven't seen much on the performative nature of web2.0; I would be interested to hear of other sources.

Actually I'm sorry for Danah Boyd, but her whole post seems to me to be mostly a post-fiasco exercise in making excuses for having a bad day. (And of course, I'm sorry people were crass enough to stoop to sexual innuendoes.) Alan, I've seen this done in lots of conferences and it seems to work fine - it even seems to add to the excitement. Think about this - would Boyd have written an entire post on this practice if her talk had gone well? Hell, she wouldn't even have *noticed* it!

What you say about Boyd is probably true, scritic, but the larger point remains: even if the Twitter stream "adds to the excitement," it makes comprehension of sequential thought impossible. That's a big price to pay for excitement.

Consider what happened the following day during the keynotes. Speakers adapted to the additional channel of information. Some adapted to the potential for dialog with attendees instead of a traditional information monolog.

"Some tweets contained simple status information such as "The presentation is starting now." Commonly, audience members posted quotes from the presenter. Other tweets contained supplemental information such as 'The link for more information is ...' Occasionally, the tweets were fact checks. A few tweets contained a surprising insight or humorous anecdote." From: http://www.blog.oplaunch.com/product_launch/2009/12/adding-social-media-to-new-product-development-efforts.html

By the second day of presentations, a Don't Feed the Trolls (DNFTT) capability was activated. The feed generated from #w2e (the publicized Twitter hashtag) was moderated. There was a mechanism to remove inappropriate comments before they were displayed on the big screen.

Tweets are not anonymous. Authorship is revealed. The information stream is reviewable during and after the event.

I found the Tweets provided a richer experience and facilitated engagement in the conversation.

Live tweeting is very useful for covering AGMs or meetings where voting and democratic decisions are being made. It allows those most interested in the event to report the decisions and atmosphere of the meeting more quickly than any other system can.

Two months ago at the highly contentious Canadian Federation of Students Annual General Meeting, which is dealing with some severe internal problems, the twitter universe allowed activists to get information about contentious matters of the AGM out in the open: http://thevarsity.ca/articles/23679

At one point on the #cfs09 feed, when a highly contentious motion was about to be voted on, there were about 15 tweets a second and the fire alarm even got pulled. It was an exciting time for students across the country watching the twitter feed--and it certainly made the whole thing more democratic for those who couldn't attend.

I wouldn't completely shun livetweeting and throw the baby out with the bathwater. It just needs to be done right, which, in the case of putting up twitter feeds behind speakers, it wasn't.
