"I do not care that these fun little books exist, but that they are dominating the public conversation. ... In the meantime, I think the two essays linked to above ["Is There a New Geek Anti-Intellectualism?," by Larry Sanger; "The Internet Intellectual," by Evgeny Morozov] are an important pairing to start a conversation over who gets to frame how new technologies are understood. Will it be a-historical, a-theoretical, non-rigorous business folks or can we inspire a new wave of technology-centered public intellectuals?"posted by MonkeyToes at 5:10 AM on July 22, 2012

Blargh. Evgeny Morozov has made a career of being an "iconoclast" by way of being an asshole, and Jarvis, while he has similarly made a career out of being the loudest person in the room, can at least process criticism.

Arguing over whether these folks are the best people to produce trenchant criticism rather than producing the criticism you'd like to see seems like a fool's game.posted by to sir with millipedes at 5:13 AM on July 22, 2012 [2 favorites]

How is Evgeny Morozov an asshole: as far as I can tell, he's injecting a healthy dose of scepticism into the breathless discussion of the Internet and its future by the likes of Jarvis and Shirky. These guys spend their lives making extraordinary claims that frequently go unchallenged, and Morozov is doing a great public service by questioning them without falling into the trap of being anti-Internet himself. More strength to him.posted by axon at 5:26 AM on July 22, 2012 [10 favorites]

Jarvis is perhaps better viewed as a populist polemicist, which explains a lot of the disconnect here.posted by jaduncan at 6:13 AM on July 22, 2012 [1 favorite]

There is an awful lot of value in explaining things clearly, value that's often lost on academia in the humanities, including fields studying the role of technology. You need not distill your message down into tweets to make it comprehensible.

I think that there are a whole range of issues here, and the fact that the comments on that article explode in a multiplicity of different directions at once is indicative of the fact that there are complex issues at stake here about both the production and consumption of knowledge, the role and status of 'the intellectual' in contemporary culture, the form of the publishing industry's mechanisms of dissemination, and whether or not we live in an era where overt, sleeve-worn, expertise is viewed with suspicion. I have little knowledge about the specifics of the issue with regards to technology and geek-culture (other than to say I despair when debates like this gradually devolve and ossify into whether the internet/technology/ourgreatbignow is A Good Thing or A Bad Thing), but I think there are really important systemic structures that sit behind some of these problems. Generally, there seems to be an increasing reluctance on the part of those people who work in academia to go into bat on issues of genuine contemporary public interest. This is not to say that only those people who work within the academy are sufficiently equipped to speak upon these matters: christ only knows that's not the case, but, as Jurgenson points out, their absence clears the field for the public discourse to be effectively monopolised by punchy, soundbite-heavy, t-shirt-of-the-week drive-by thinking. The conversation needs that, but it also really needs balance, and in this case, that balance can (and should) be provided by those who might have a longer view, be driven to identify underlying trends or complex structures, be willing to propagate unpopular or divisive thinking, and be capable of drilling down into an issue and understanding its contextual implications. Academics, generally speaking by virtue of their training, are the ones we as members of the public might reasonably expect to be doing this sort of intellectual work for us and then disseminating it to wider audiences.
In many arenas of our national lives they do indeed do this work, and in others, not so much (or not many of them do, at least). And here's the problem: for those that don't, who the fuck can blame them? Many live in a world which only very rarely incentivises joining wider public discourse; indeed, frequently such a thing is actively discouraged. Professionally, their esteem markers consist of engaging in conversations limited to their own immediate (and in some cases, extremely limited) subject circles, and the time taken to try to simply communicate complex ideas to millions, which would gain them absolutely no professional prestige whatsoever, will eat into the time needed to communicate similar ideas to a handful of colleagues, wherein lies career advancement. Personally, they may fear scorn (or even simultaneous jealousy) if they do try to speak about issues and to audiences that sit outside the academy. They would only damage themselves if they did this. So we end up at an impasse where we need a different kind of thinking about the issues that touch all our lives (be they internet privacy, or our ongoing economic clusterfuck), a phalanx of minds trained to do that, and an ingrained professional structure that discourages those public needs from being met.posted by hydatius at 6:32 AM on July 22, 2012 [7 favorites]

How is Evgeny Morozov an asshole.

Well, in the article in question it doesn't take two paragraphs for him to wheel out a sneering sentence that opens with the word "one": "One cannot fault Kirk for thinking too small."

Perhaps a shallow reading on my part, but the unironic use of the "one's self" construction is a very reliable warning sign of incipient literary douchebaggery. And on that front, my goodness, Morozov delivers: the next four paragraphs construct a cartoonishly unflattering image of the man whose work he's critiquing without even fairly citing, much less addressing, any argument Jarvis may have made.

He goes on to explain that the most glaring flaws in Jarvis' arguments are that Jarvis is not at all like Morozov, and that's about when the article starts circling the drain in earnest, using increasingly baroque terminology to make increasingly simplistic (and frequently nonsensical) assertions.

And Jurgenson falls for it. He acknowledges the extensively ad hominem nature of Morozov's article, but concludes that "Morozov’s dismantling of Jarvis picks up when he quickly moves into attacking the ideas contained in the book", blessing the article as a "necessary critique" without realizing that Morozov gives precisely the same treatment to Jarvis' arguments that he gave Jarvis himself.

As he goes on to ask "Where is the Marshall McLuhan of social media?", I have to wonder if he realizes what he's asking for. Television had been around for a long time before McLuhan had anything to say about it, and social media is pretty new to the scene. But beyond that, why would he go looking for a strong commenter about this new, pervasively networked space, among people who cling desperately to their own iconoclasm?

It seems like a lot of trouble, to find somebody floating alone in the Atlantic, clinging white-knuckled to their lifevest, so you can ask them about the finer points of shipbuilding and navigation.posted by mhoye at 6:35 AM on July 22, 2012 [5 favorites]

From the Larry Sanger essay:

The more that a person really takes seriously that there is no point in reading the classics, the less likely he’ll actually take a class in Greek history or early modern philosophy.

I think that a big reason why people don't believe in reading the classics is because they're still routinely sold as items of intrinsic value, rather than, e.g., things you study to gain context with which to understand the world you live in.

They certainly have intrinsic value for some people, and those people tend to become classics professors. But the section of the population for whom studying history is fun and interesting, irrespective of why they're reading it, has always been very small.

In the past, geeks have often taken it anyway because it was the only way to get at the learning they really wanted. Then they got the benefits of context from what they'd learned, and so they gained appreciation of it after the fact. That's not happening so much anymore, because of how easy it is to bypass the institution and study what you like.

I think there's a lot of value to be had in learning-for-context, even in cases where most people won't get any value from learning-for-content. But I can only contemplate that idea because my father went to great pains to explain it to me when I was younger and more contemptuous. Most parents don't.

It is not actually a very hard idea to understand if you're a computer programmer. You probably deal with graphics contexts all the time, and if you deal with the wrong one your users won't even be able to see your program in operation.

But most geeks don't try, because they were made happy and successful by an approach that ignored all that context in favor of the content that interested them. And so, however easy it would be to reconsider, they don't try.

Indeed, given the concerns I've sometimes expressed about the "cult of irrelevance" in academe, I've come to believe that blogging ought to be actively encouraged in the academic world. I'm not saying that all political scientists, historians, or economists ought to start their own blogs, but we shouldn't penalize scholars who do engage in this activity and we might even consider rewarding it, the same way we should reward scholars who care enough about public service to use their talents and training working in the public or NGO sector. It would be good for the IR field if academic scholars were expected to write a few blog posts every now and then, if only for the purpose of self-examination. If the typical academic had to write a blog for two weeks, they might discover they had nothing to say to their fellow citizens, couldn't say it clearly, or that nobody cared. That experience might even lead a few of my fellow academics to scratch their heads and ask if they were investing their research time appropriately, which would be all to the good. (Stephen Walt, Foreign Policy, Jan 2010)

IF THERE is any endeavour whose fruits should be freely available, that endeavour is surely publicly financed science. Morally, taxpayers who wish to should be able to read about it without further expense. And science advances through cross-fertilisation between projects. Barriers to that exchange slow it down.

There is a widespread feeling that the journal publishers who have mediated this exchange for the past century or more are becoming an impediment to it. One of the latest converts is the British government. On July 16th it announced that, from 2013, the results of taxpayer-financed research would be available, free and online, for anyone to read and redistribute. (The Economist, July 21st 2012)posted by infini at 6:48 AM on July 22, 2012 [7 favorites]

Interesting that you choose to make that comment in the form of a footnote.

Unless those you disagree with have defended parenthetical comments while agreeing with your evaluation of the quality of writing, you're presenting them with a question-begging argument: "footnotes and other commentary are bad style because they're bad writing".

No piece of writing suffers through the application of intellectual rigour. The fact that a field has stylistic norms that differ from those of your own doesn't mean that the people working in that field don't appreciate or understand the importance of clarity. Complex ideas require nuanced expression in order to be expressed clearly. Simplicity and clarity are concepts as distinct in writing as those of accuracy and precision in science.posted by howfar at 7:08 AM on July 22, 2012 [12 favorites]

Well, that's absurd. The humanities haven't always valued clarity enough - in contrast to the STEM fields, where they often just seem incapable of it - but footnotes and parenthetical remarks have nothing to do with that. Careful writers can use them effectively, just like anything else.posted by two or three cars parked under the stars at 7:08 AM on July 22, 2012 [13 favorites]

That Evgeny Morozov article seems rather mean-spirited. Intellectuals trying to convince readers that other intellectuals aren't intellectuals doesn't really get you anywhere. Like in politics, I wish they would stick to the issues.posted by demiurge at 7:48 AM on July 22, 2012

He goes on to explain that the most glaring flaws in Jarvis' arguments are that Jarvis is not at all like Morozov, and that's about when the article starts circling the drain in earnest, using increasingly baroque terminology to make increasingly simplistic (and frequently nonsensical) assertions.

I have re-read the article after reading your comment to better understand your point. Could you point out which terms Morozov uses that are "baroque"? I think his use of the word "perplexity" might perhaps qualify, in the final couple paragraphs. Other than that, as far as I can tell, the essay has a fairly simple structure --- after about a page or so where Morozov draws a sneering analogy between Jarvis and the fatuous protagonist of the Bradbury novel, he then spends the next 5,000 words or so briefly quoting or paraphrasing one of Jarvis' ideas and then running down a list of obvious counter-examples that, he says, Jarvis doesn't touch on. If Morozov is wrong about his counter-examples, then he is being quite unfair to Jarvis. But if he is correct, if Jarvis doesn't touch on any of the obvious stuff he brings up, then he seems to me to have made quite a strong case for the shallowness of Jarvis' book.posted by Diablevert at 7:52 AM on July 22, 2012 [1 favorite]

Morozov's essay seems spot on, and he picks up on a sound-bite focused style of writing which has become popular in "intellectual" books recently. Probably not due to Twitter, as the standard argument goes, but rather a general lack of attention span. I think of it in relation to the (multi-leveled!) news crawls at the bottom of news channels:

This is pointing up something very important. In just the last few weeks I've encountered a few expressions of this discussion - the need for an independent and critical intellectual response to contemporary cultural issues - and the rustling alone is encouraging.

W/r/t internet culture, as an erstwhile journalist I have been concerned since the earliest days about the HURF DURF OLD MEDIA unthinking slapdown of news organizations. Now that we're reaching a seriously constrained time in which a few media conglomerates dominate reporting, and most other news sources crib from a very few news production outlets, we're finally starting to hear some "uh oh, did we mean to do that?" hesitancy on the part of the crowd that at one point naively believed information wanted to be free, and that people wanted to create it for free. This was never true for vital civic information, which a lot of people are invested in not being free and which is costly, sometimes dangerous, skill-demanding and time-consuming to generate. That revolutionary rhetoric has been fairly costly.

Hang on. We're having a discussion about Evgeny Morozov and the difference between intellectualism and cheap point-scoring — and Morozov is on the intellectual side of the comparison?posted by RogerB at 8:12 AM on July 22, 2012 [2 favorites]

The battle seems to be between two types of intellectuals: Those whose success is measured by their standing in academic circles and their proximity to tenure, and those to whom success is the ability to get high paying consulting gigs at media corporations. Meanwhile, everything's changing so fast that hardly anybody can say anything demonstrable about the state of the internet, where it's going or what we should do about it. I've found sometimes that the wisest things said about the flux are in well educated but semi-anonymous comments on blogs and metafilters, where the only stakes are: I have seen it, and I believe it to be thus, and so I share it with you. That's the purest form of intellectualism if you ask me.

Morozov's essay seems spot on, and he picks up on a sound-bite focused style of writing which has become popular in "intellectual" books recently. Probably not due to Twitter, as the standard argument goes, but rather a general lack of attention span. I think of it in relation to the (multi-leveled!) news crawls at the bottom of news channels:

Actually, it goes back to the Fall of 2005, when some n00b in teh blogosphere wrote a blog post attacking Seth Godin's blogpost on the shift away from books and heavy reading to the easier to digest, rapid little blurbs. That post back then in history was titled "Sound bites? Books as Pizza" or some such nonsense and M. Godin left a comment to the effect that the n00b might have a point re: the need for thoughtful and contextual underpinning of one's offerings to ponder and chew and mull over.posted by infini at 8:35 AM on July 22, 2012

I wonder if being an internet (anti)intellectual is participating in a controversy that doesn't exist?posted by hellslinger at 8:36 AM on July 22, 2012 [1 favorite]

One man's public intellectual is another man's elitist asshole. I wasn't aware we were suffering from a dearth of assholes.posted by deathpanels at 8:36 AM on July 22, 2012

Isn't that the internet itself?posted by infini at 8:36 AM on July 22, 2012

One more thing, I have drunk many beers with Nathan Jurgenson, and I can say definitively he is no fake intellectual--dude is sharp as shit. Make sure to read his response to Sanger's Geek Anti-Intellectualism piece here.posted by Potomac Avenue at 8:42 AM on July 22, 2012 [1 favorite]

Potomac Avenue, I hope he joins us here. I wanted to write him an email on seeing "where is the Marshall McLuhan of social media?" I recently asked that very question but not with the qualifier of social media (Jurgenson's own field of interest) but in general and in the way communications technology was changing things from the PoV of the mobile internet revolution taking place like here in Kenya.posted by infini at 8:45 AM on July 22, 2012

Well, in the article in question it doesn't take two paragraphs for him to wheel out a sneering sentence that opens with the word "one": "One cannot fault Kirk for thinking too small."

You have got to be fucking kidding me. He uses "one" rather than "you"? Horrors.posted by kenko at 8:51 AM on July 22, 2012 [7 favorites]

No anti-intellectualism here! Just don't use academic language conventions or a footnote or a subtitle or any word I ain't never heard otherwise I'll call you a douchebag.posted by Potomac Avenue at 8:53 AM on July 22, 2012 [9 favorites]

I dunno, every time I read something somewhere that states that humanities scholars should be doing x it is always a bit of a wtf moment as they generally already are. Sure they could probably be doing it better and on a wider scale but that's true of all things.

In my particular interdisciplinary area that has a substantial humanities slant there have been blogs all over the place by academics looking to reach a wider audience. In my more general media and communication studies area there is a long standing tradition of both writing articles for a mainstream audience and a lot of focus on involvement in non-academic policy debates. I have gigs of history lecture podcasts put up by academic historians. I know the image of the crusty old tenured professor in their ivory tower pursuing their particular sub sub sub-field of research is enduring, but it is also kind of frustrating.posted by Hello, I'm David McGahan at 8:56 AM on July 22, 2012 [1 favorite]

Footnotes are important because academic literature is a process of interacting with other scholars. Footnotes allow you to understand the sources from which the author is drawing their support or opposition in conveying the idea they are discussing, or to explain something in the text that is relevant but not critical to the central theme. They clarify the exact situation of the author's thought in relation to the rest of their field.

The fact that TED talks don't use them is all of a piece in my mind with the way those discussions tend to exude an ahistorical 'no one else has ever thought of this!' self satisfaction with triviality.posted by winna at 9:10 AM on July 22, 2012 [7 favorites]

Anglo-American (and also Au/NZ) Media Studies departments often seem hung up on a rather outmoded discourse around mass media, usually based on analysis of 20th Century TV culture. Much of the debate ends up concerning what amount to PR and advertising strategies.

There are a number of academics who are looking at things from a much different angle. People like Huhtamo, Parikka, Wolfgang Ernst, Matthew Fuller, Wendy Chun, Siegfried Zielinski and Katherine Hayles. Starting with Kittler in the 80s these folks actually go beyond investigating the culture that emerges from new media technologies. Instead they take it a step further to examine the technology itself and the origins of media. They deconstruct the designs, protocols and informing philosophies of the engineers, coders and early adopters. It may seem a bit esoteric but I'm really into the things that people are digging up using "Media Archaeology".

The other, completely different, and less academic direction that I dig lately comes from the people who are doing "Digital Folklore" like Dragan Espenschied and Olia Lialina. They are interested in the visual culture and discourse communities of non-corporate internet culture like fan-boards, animated gifs, mailing lists, etc.posted by mr.ersatz at 9:15 AM on July 22, 2012 [5 favorites]

The fact that TED talks don't use them is all of a piece in my mind with the way those discussions tend to exude an ahistorical 'no one else has ever thought of this!' self satisfaction with triviality.

Against TED - The New Inquiry
What began as something spontaneous and unique has today become a parody of itself. What was exceptional and emergent in the realm of ideas has been bottled, packaged, and sold back to us over and over again. The whole TED vibe has come to resemble a sales pitch.

The article that led me through the looking glass of Jurgenson's work btw.posted by infini at 9:19 AM on July 22, 2012 [1 favorite]

It’s because there is a very big difference between a statement occurring in a database and someone having, or learning, a piece of knowledge. If all human beings died out, there would be no knowledge left even if all libraries and the whole Internet survived. Knowledge exists only inside people’s heads. It is created not by being accessed in a database search, but by being learned and mastered. A collection of Wikipedia articles about physics contains text; the mind of a physicist contains knowledge.

Let's just take stock of what Mr. Sanger is saying here. If we believe this premise, that knowledge is only knowledge if it's inside a human brain and not when it's recorded in symbols, then libraries are essentially worthless resources for an individual. A library is instead a vast resource for teachers to call upon when constructing lessons; it is not a source of knowledge or understanding. Knowledge can only be transmitted from teacher to pupil, handed down by an oral tradition. The only kind of valid knowledge is then working knowledge.

Only an angry PhD could ever possibly believe this knee-jerk epistemology. For starters, we can point to a number of historical occasions when the libraries burned down and knowledge was lost, and an equal number of the reverse event: a library is rediscovered, and knowledge comes back into being after decades of being lost. Mr. Sanger's idea of the role of archives in the persistence and growth of human knowledge is grossly wrong. He correctly differentiates "living knowledge" (knowledge that exists in a person's head, being applied to current problems) from archival knowledge that sits dormant in libraries (and yes, the internet as well) until a human actor comes along to "make it real". But he exalts the former over the latter, and that is where, as we say in the world of anti-intellectual internet commentators, I call bullshit.

For the plebeians, those of us that haven't spent most of our lives "creating" living knowledge, libraries and publications are the best source for self-directed learning. I don't have access to a PhD in literature if I want to learn about literature – but I have a library card, and the skills to perform research that allows me to function as my own instructor, within certain topics. If I didn't possess those skills, I would have very little "living" knowledge indeed.

It is only natural for Mr. Sanger to believe what he's saying in this paragraph because it places his job as an academic and an instructor at the locus of human society – specifically, it puts him a couple rungs above the Silicon Valley elites. But he is arguing for a non-real situation exactly as much as his counterparts in the Wikimedia foundation who want to connect our brains to "the cloud" and stop "living" knowledge entirely. Archives, be they digital or not, aren't worthless. They're extremely important to the kind of society we live in. And throwing libraries under the bus in the effort of discrediting Wikipedia is poor scholarship.posted by deathpanels at 9:22 AM on July 22, 2012 [1 favorite]

every time I read something somewhere that states that humanities scholars should be doing x it is always a bit of a wtf moment as they generally already are.

For sure, and the Walt piece is equally baffling to me — but often these arguments become quite comprehensible if you just mentally substitute "should be institutionally recognized/rewarded for doing x" instead. It is certainly the case that many humanities people blog about and do a lot of other Internet-based discussion of their fields (this thread says hi); it is much less often the case that their doing this receives any kind of real recognition or reward either within or outside the academy. Indeed it's much more often the case that this thread, and every other discussion like it (up to and including the FPP link and arguably even Morozov's TNR review) are really "stealing" writing time that, if their authors were truly perfectly rational, would be going into peer-reviewed articles and print-publisher-bound books instead. In fact it probably goes right to Morozov's "book that should have stayed a tweet" point that books like Jarvis's (or equally well The Net Delusion), in the currency of the "public intellectual" world that Jarvis inhabits as well as the CV currency of the academy, somehow seem to count as more serious and weighty works merely because their lightweight arguments are wrapped in print-codex form.posted by RogerB at 9:24 AM on July 22, 2012 [2 favorites]

And sometimes you have to use footnotes to point out that you are using a word in *this* way, that is associated with a particular intellectual tradition and not *that* way, which is associated with another and which would entail a whole bunch of connotative meaning that would give a completely different interpretation to what you have written. Which otherwise would lead to a whole bunch of misunderstanding and pointless argument.

On preview - mr.ersatz, yeah there is still a bit of an over focus on the idea of the passive audience which tends towards patronising the everyday consumer of media, but for mine at least it is in decline. Of course the problem is that understanding the impact and influence of media messages ends up being an even more complex and difficult to unravel phenomenon, and we end up with yet another layer of theory that has to be communicated.

I've recently found the domestication theory that Silverstone and associated researchers initiated very insightful in understanding the complex interactions between economic factors, the actual uses made of media technologies, the socio-cultural context, and the work that is done by designers and engineers on the technical/technological side of things. The popularity of mobile phone text messaging is an interesting case study of how these kind of things can play out in unexpected ways. Importantly it takes into account the cyclic nature of media technologies, and how the expectations concerning use, by both designers of media technologies and users of media technologies, are open understandings that shift all the time.posted by Hello, I'm David McGahan at 9:28 AM on July 22, 2012

RogerB I agree completely, dissemination of current research to wider audiences should be better recognised and rewarded as a key function of academia. The sciences get this I think with the efforts made through documentaries and other forms of mainstream media. Though the relative ease with which climate science has been questioned and to an extent discredited in popular discourse should give us all pause about the position of expertise - it is not unthinkable that the sciences could find themselves in a similar position to what the humanities occupy at the moment if the cards fall in certain ways.

Anyway, perhaps it is a good thing that humanities academics are often not entirely perfectly rational about outcomes. Case in point. I am supposed to be waking up from a good night's sleep in about 3 hours or so in order to work on a journal article that will be institutionally recognised and rewarded.posted by Hello, I'm David McGahan at 9:43 AM on July 22, 2012

jeffburdges, am I understanding your comment correctly if I think you're suggesting that footnotes are in and of themselves almost always evidence of poor writing? And that writing in the humanities shouldn't use or defend them because STEM-specific writing practices discourage them?

Because I find that... baffling.posted by hank_14 at 9:46 AM on July 22, 2012

That's because most STEM writers aren't as good at writing and require these strictures. A typical rite of passage for anyone in the humanities is to earn extra cash and favours helping non-humanities types write coherently.posted by mobunited at 9:55 AM on July 22, 2012 [3 favorites]

Archives, be they digital or not, aren't worthless. They're extremely important to the kind of society we live in.

I wonder: is it at all significant that little Maker's Mark bottles are popping up on the bottom left of my screen as I read Morozov's review?posted by hank_14 at 10:17 AM on July 22, 2012

W/r/t internet culture, as an erstwhile journalist I have been concerned since the earliest days about the HURF DURF OLD MEDIA unthinking slapdown of news organizations.

And well you should be, as that isn't internet culture, that's rightwing blogosphere triumphalism.

As you know Bob, after the September 11 attacks quite a few of the earliest bloggers, who days before would've been more interested in the latest Microsoft release than in politics, went somewhat loopy and turned rightwing. Dissatisfaction with the traditional media (which ironically had been fed by Fox News and rightwing talk radio) led them to declare themselves as the future of news. This reached its highpoint with the whole "kerning" debacle where these geniuses claimed they could tell from a not very good screenshot of a particular document that the typewriter used was all wrong and therefore Bush had not gone AWOL and Dan Rather was a big poopiehead. That's where Jarvis comes from, having been on wingnut welfare for the better part of the last decade. Of course his argumentation is shallow; that's the whole point.

But that's a political choice, not something that's inherent to the media in which he works. Drawing any conclusions from this without taking this political context into consideration is wrong.posted by MartinWisse at 10:19 AM on July 22, 2012 [1 favorite]

People who complain about anyone's lack of rigor should not be arguing in generalizations supported by straw men.

I get the sense that a lot of geeks are acting–quite unusually for them–defensively, because I’ve presented them with a sobering truth about themselves that they hadn’t realized. Consequently they’ve been unusually thoughtful and polite. What struck me about these discussions was the unusually earnest attempts, in most cases, to come to grips with some of the issues I raised. [...] Of course, there has been some of the usual slagging from the haters, and a fair number of not-very-bright responses, but an unusually high proportion of signal, some of it quite insightful. Reminds me of some old college seminars, maybe.

WTF, I don't even. Reality is slapping him in the face but instead he concludes that he is a messiah who has shocked the rabble into self-awareness of their own pitiful natures.

And does he consider that an example of the intellectualism which "geeks" should emulate?

This smells like the "humanities vs. science" culture war. I hope they remember how that went for them.posted by 0xdeadc0de at 10:29 AM on July 22, 2012 [1 favorite]

That's as moronic as believing you should only write reports in the passive voice. Or never in the passive voice.

Footnotes are great tools for those who know how to use them, i.e. to provide references without breaking up the flow of the main text, showcase interesting digressions and exceptions to your main thrust, or let your heroes communicate with each other inside books.posted by MartinWisse at 10:31 AM on July 22, 2012 [2 favorites]

You have got to be fucking kidding me. He uses "one" rather than "you"? Horrors.

While I commend you, in this thread about the menace of nerd anti-intellectualism, for responding to the first line of my comment and ignoring the rest of it, as I said in the very next sentence, that's not an offense per se. It's one of nature's warning signs, like the brightly-coloured banding on a poisonous insect. Take a look at this:

The Dewey-Lippmann debate, which broached many of these issues almost a century ago, goes completely unmentioned. Bruno Latour’s more recent attempts to produce a political theory that could account for the emergence of issue-oriented and object-oriented publics is nowhere to be seen. All we get are some glimpses of Habermas.

That is not an argument: it's a half-assed combination of name dropping and concern trolling. And the entire article is full of that. It's hard to believe that Morozov's claims couldn't have been delivered without the relentless personal attacks and drive-by citations masquerading as arguments, but that's where we are.

He may well have a real and very important point, but this consistently awful delivery makes it basically impossible to know for sure. And as a result, as far as I can tell, he comes across as somebody standing on a corner handing out conspiracy pamphlets who happens to own a closet full of tweed.posted by mhoye at 10:53 AM on July 22, 2012 [3 favorites]

Now that we're reaching a seriously constrained time in which a few media conglomerates dominate reporting, and most other news sources crib from a very few news production outlets, we're finally starting to hear some "uh oh, did we mean to do that?" hesitancy on the part of the crowd that at one point naively believed information wanted to be free, and that people wanted to create it for free.

This seems to imply that "the crowd" who supported eliminating barriers to dissemination of information has somehow caused media consolidation. I'm not sure I see how that's true.

The lament at heart in this article seems to be the same--why aren't the "good" sources of commentary/news more popular? In other words, why is the public so dumb? Which brings us to the circle: the public is dumb because it does not care about the "good" sources, and it does not care about the "good" sources because it is dumb.posted by cheburashka at 11:07 AM on July 22, 2012

I think it is absolutely true that geek culture is anti-intellectual. It's an inheritance from an engineering culture that values application over theory, coupled with a business-oriented mindset that's constantly demanding to see the ROI of everything. So we get the usual criticisms of academics: they are eggheads stuck in their ivory towers, out of touch with the needs of the average person and so on. The point of this post is to engage in some of these same attacks -- Jurgenson argues that the lack of public intellectuals is the fault of academics and their poor writing skills, not a culture that's overwhelmingly hostile to anyone who acts as a public intellectual.

This is not unexpected -- to be a media expert, you first have to pander to the public's distaste for expertise and take part in bashing it. Jurgenson wants to be a media expert. It's lazy to accuse everyone who's in the media of attention-seeking - if someone has something interesting or new to say, that's forgivable. But in practically all of his writings these days, Jurgenson cannot resist including a self-promoting paragraph or two about how he coined the phrase "digital dualism" -- a term of limited value that basically functions as a pejorative for people he disagrees with.posted by AlsoMike at 11:10 AM on July 22, 2012 [7 favorites]

Trantor, QED.

I'd also like the record to show that I read this in the same tone of voice Strongbad used to sing about Trogdor.

TRANTOR! Archivating the countryside!

*gasps at the depths of anti whatsits we've all sunk to*

*will return with intelligent argumentation on key points above after the sunday evening whisky*posted by infini at 11:17 AM on July 22, 2012

He may well have a real and very important point, but this consistently awful delivery makes it basically impossible to know for sure.

His point is simple and obvious: that Jarvis' book is shallow. He's not trying to prove Jarvis wrong by fully refuting Jarvis' arguments point by point; he's saying the book is bad because it does not acknowledge any of the previous thinkers who have dealt with these arguments, but instead treats them as if Jarvis was the first to think of them. What you call "name dropping" is the point.

I mean, if I was writing a book about baseball and had a chapter in it on controversies, it'd be a fair critique to say "Dv calls Clemens 'undoubtedly the most reviled figure in baseball history' but makes no mention of Ty Cobb, Shoeless Joe Jackson or Pete Rose." I could still be right about Clemens, but the fact that I skip over other huge and arguably more serious scandals suggests that I'm either ignorant of or glossing over important aspects of my chosen subject in a way that ought to cast doubt on my expertise. If you want to prove me wrong about the relative importance of the steroids scandal vs the Black Sox, then that calls for one kind of argument. But if you just want to prove I'm a shitty baseball historian, the fact that I don't mention the Black Sox at all suffices.posted by Diablevert at 11:21 AM on July 22, 2012 [2 favorites]

While I commend you, in this thread about the menace of nerd anti-intellectualism, for responding to the first line of my comment and ignoring the rest of it, as I said in the very next sentence, that's not an offense per se. It's one of nature's warning signs, like the brightly-coloured banding on a poisonous insect.

If you didn't think it was significant you should probably not have led with it.

The use of 'one' is indicative of a particular intellectual tradition, and rooted in a particular form of education. I would suggest you are not a big fan of this tradition or the people it has produced, but that is not an indication that their outputs are without value; rather it reflects baggage that you have about their social class. This is not a form of construction I tend to use, but I have read plenty of literature which does and which has been cogent and valid.posted by biffa at 11:30 AM on July 22, 2012

mhoye, I don't think you're giving Morozov a fair shake by saying he's all ad hominem. Maybe he jumps too happily on examples of Jarvis' personal hypocrisy re: his ideas on privacy, but he also engages pretty directly with those ideas themselves and with Jarvis' overall intellectual approach. For example:

THERE IS NOT much consistency in Jarvis’s thought about technology. Whenever he needs to explain something positive, his instinct is always to credit the Internet: it is the one factor responsible for more publicness, more democracy, more freedom. And every time he turns to darker and more difficult subjects—like discrimination, or shame—he announces that they have nothing to do with the Internet and are simply the product of outdated social mores or ineffective politics.

Which is followed by examples from the books. Or, how about:

[Jarvis] chides privacy advocates for focusing on edge cases, such as teenagers who are ostracized because their private videos appear online—"this debate tends to be held around the extremes.... edge cases are good at feeding debates but not at informing norms"—but then he proceeds to build the case for "publicness" entirely with edge cases. How normal are Howard Stern, the "New York gadabout" Julia Allison, Oprah Winfrey, and Josh Harris of We Live in Public fame? Are any of them "informing norms" that would apply to an unemployed and uninsured single mother from Iowa?

This is valid and straightforward criticism. It's about what Jarvis said, not who he is.

>He may well have a real and very important point, but this consistently awful delivery makes it basically impossible to know for sure.

The writing weeps for an editor, but here's the tl;dr.

Why worry about the growing dominance of such digitalism? The reason should be obvious. As Internet-driven explanations crowd out everything else, our entire vocabulary is being re-defined. Collaboration is re-interpreted through the prism of Wikipedia; communication, through the prism of social networking; democratic participation, through the prism of crowd-sourcing; cosmopolitanism, through the prism of reading the blogs of exotic “others”; political upheaval, through the prism of the so-called Twitter revolutions.

Although it sounds a little ridiculous when you read it all by itself, within the scope of what Morozov's saying that's a concern I share -- especially the "Twitter revolutions" part -- and I'm always grateful to hear criticism like this that isn't godawful clickbait about how the internet is ruining everything forever.

Footnotes are a technical way of solving a common problem - you need the argument to flow, but there are complexities which need noting without distracting from the flow. Not surprisingly, today, we often solve that technical problem thanks to what else, but the evolution of technology - on the internet! - hypertext... filled with hyperlinks - the modern form of footnotes.

That said, footnotes are a failure - in the anglophone world. Not due to any inherent characteristic of footnotes, but to their mode of employment. I will never understand, as long as I live, why oh why, do American and British publishers insist on taking all the footnotes and clumping them together at the end of the book. It's as moronic as having a dictionary in which the terms to be defined are in one volume, and the actual definitions are in another one. It's so inefficient it must have been designed in a time of energy surpluses - my arm is tired from the endless flipping back and forth with every footnote, I need an extra meal just to replenish the energy wasted. I would not be surprised if the majority of global warming is due to this appalling waste of energy - where of course, yet again, America is the big wastrel.

The way to do footnotes, if you have any sense at all, is how it's done in, e.g., Germany - admittedly, I'm only familiar with publications in the philosophy field in German - but the footnotes are at the BOTTOM OF THE FUCKING PAGE where they are referenced in the main text of the page, and in small print. Now, this is not a perfect solution and sometimes there are amusing situations where the footnotes are so extensive that the main text is just a sentence or so, and 90% of the page is taken up by small print footnotes, but this is the minority. Most of the time it works exactly as the term implies - it is at the FOOT of the page, where you can refer to it immediately should you choose to do so, instead of flipping through the whole book or having to keep a finger at the back of the book and following which page the footnotes have moved to now in chapter X.

I love footnotes when they are FOOTnotes and they are extremely useful. But not when they are as happens mostly in anglophone books, BACKnotes.posted by VikingSword at 12:23 PM on July 22, 2012 [4 favorites]

Somewhere under all this is a megalol, involving people who practice Cultural Studies arguing with each other about who's more academically-rigorous and who's less intellectual.posted by genghis at 12:28 PM on July 22, 2012

And underneath that megalol is another deeper, turtles-all-the-way-down megalol about someone lumping a bunch of different thoughts and thinkers together, eliminating nuance and difference, and then dismissing the whole bunch under the italicized label "Cultural Studies." Ho ho ho.posted by hank_14 at 2:21 PM on July 22, 2012 [2 favorites]

As Internet-driven explanations crowd out everything else, our entire vocabulary is being re-defined.

Speaking of that, I don't think it's very wise to move the discussion of anti-intellectualism away from currently representing fundamentalist deniers of global warming and over-population, and haters of public education. Those are blatant instances of anti-intellectualism with real political consequences. There's a major difference between Wikipedia bullies who think they know more than a college professor (but insist on quoting them) and the truly stupid who think the professors are evil, therefore wrong.posted by Brian B. at 2:27 PM on July 22, 2012 [2 favorites]

Part of the problem may be that Jarvis writes as if he's the Internet Ambassador to the IRL public, and I think the two essays are looking for intellectualism in the wrong places, ironically - social networks, pop-nonfiction (and, as a side note, the snarky / patronizing tone and intellectual bullying of both essays are really revealing of the authors' own biases). The truth is, individual curiosities will dictate one's quest for increasingly specific levels of knowledge, and the "deep" discussion on almost any topic is there for anyone who wants to follow the breadcrumbs - intellectual discourse just appears comparatively invisible, because the lowest common denominator on the Internet is kitten movies. Everybody loves kitten movies. But you're not going to be able to make valid cultural inferences about the entire Internet by looking at twitter.

To play Devil's Advocate - the very existence of Twitter represents an anti-intellectual imperative; but I'd argue back, Twitter is a reaction to the dominant (2006) notion of Internet as Information Superhighway - it creates a social space distinctly outside of the realm of research. In doing so, it has certainly appeared to "dilute" the intellectual value of Internet culture - undoubtedly - but it's not as if people stopped using the Internet as soon as Twitter became a thing; more likely, people blah blah blah blah which is why I think you should purchase this bottle of dick pills.posted by bxyldy at 2:51 PM on July 22, 2012

Footnotes at the end of a book aren't footnotes but endnotes. I agree that they are an outrage from every perspective. Even more disgusting is when they're at the ends of the chapters instead of the end of the book, because then you don't even know where exactly to find them. It's reprehensible.posted by two or three cars parked under the stars at 3:03 PM on July 22, 2012

Have you guys ever read House of Leaves? It's a footnote-lover's wet dream, even more so than Infinite Jest.posted by bxyldy at 3:06 PM on July 22, 2012

the "deep" discussion on almost any topic is there for anyone who wants to follow the breadcrumbs

no one does, though, because that is boring

But you're not going to be able to make valid cultural inferences about the entire Internet by looking at twitter.

The original Larry Sanger essay is strange, with claims so broad they can't be conclusively argued for or against. His summary response to his critics, however, is dripping with the idea that intellectual = liberal arts. To quote:

If you haven’t studied philosophy, you can’t begin to understand the universe and our place in it–I don’t care how much theoretical physics you’ve studied.

to which a commenter rightly answered

I do find such a statement very dismissive of experts in the study of the Universe (aka cosmology) which requires deep knowledge of theoretical physics. By making the above statement, you are guilty yourself of what you accuse “anti-intellectual geeks” of doing.

I would counter by saying: if you have not studied advanced theoretical physics, if you are not aware of all the known facts about the Universe (including having a deep understanding of the physical laws that describe it) and of their limitations, then you can’t pretend to understand the Universe and our place in it, regardless of how much philosophy you have studied.

Have you heard of the "No true Scotsman" fallacy? Take a look at your response to the claim (#10) that the liberal arts are a waste of time: "If you believe this way, then I have to point out that virtually any *really* educated person will disagree with you." You seem to be saying that no *true* intellectual would agree with this claim.

I would argue that while Chaucer and such have their value and can bring joy, people who don't know multivariate calculus don't get to speak up. You can't understand anything in science -- nor statistical results nor theory -- without a working knowledge of first-college-year mathematics. You don't get a microphone to be snotty about how people are all about their coding if you don't understand code -- and doing a geeky html+php thing is not coding, that's something I did as a teenager for my parents' proto-ecommerce -- particularly because you can't understand algorithms if you don't know asymptotics.

If we are to have experts and intellectuals and such, the sine qua non isn't Hegel nor Goethe -- it's Tom motherfucking Apostol, do you speak it? People who don't know the calculus don't get a mic.

My liberal arts education, while somewhat rich -- my comment history will speak for itself -- was mostly all for waste, because we were always either in mode Hegel or mode Kant. Ten summers and ten thousand years ago I went to film school, dude, and there was a vague notion that Deleuze wrote about cinema, which would have at least clued me in to that direction, but alas, it was not to be. I think G.D. would say that yes, the internet is full of idiots, but this essay is particularly idiotic because it fails to distinguish nomadic science from royal science -- and these have always existed, and always fed off one another. The medieval blacksmith knew more about thermodynamics than all of the history of physics up to that point -- so tangled up in learning Latin first....

I would argue that while Chaucer and such have their value and can bring joy, people who don't know multivariate calculus don't get to speak up. You can't understand anything in science -- nor statistical results nor theory -- without a working knowledge of first-college-year mathematics. You don't get a microphone to be snotty about how people are all about their coding if you don't understand code -- and doing a geeky html+php thing is not coding, that's something I did as a teenager for my parents' proto-ecommerce -- particularly because you can't understand algorithms if you don't know asymptotics.

I don't understand this. You're saying that a person doesn't count as an intellectual and therefore is not qualified to discuss whether society needs intellectuals unless they know calculus? Or is it that they don't count as a geek, and shouldn't be able to speak to what geeks ought and ought not to know?

And what does any of this have to do with Sanger's argument? I have no idea whether or not he knows calculus, of course. But he's describing an attitude he sees in people he knows which derides the idea of expertise, which disagrees with the idea that it is useful to cultivate a specialized body of knowledge, and thinks it foolish to defer to someone who has acquired such, because so long as both parties have access to the same knowledge they are equivalent. He's talking about the idea "you don't know anything I can't google, so why should I pay any special attention to your opinions". I don't see anywhere where he says science isn't valuable or that it's not part of a classical education. Ditto math.posted by Diablevert at 5:35 PM on July 22, 2012

[LogicalDash] Yeah, that's another Scot.

Granted.

[Diablevert] I don't see anywhere where he says science isn't valuable or that it's not part of a classical education. Ditto math.

As I said, the text is dripping with insinuation. Just from how the response text relates to the original, you can see Sanger's strategy: write a polemic in terms so broad that no well-structured argument (for or against) can be made that he can't deflect with a No True Scotsman. He's shielded himself. And he is smart enough not to say that math is not part of an education.

The exchange with an actual cosmologist on whether you need philosophy to really unveil the secrets of the universe is telling, though. I'm kind of leftfield in this as a deleuzian, but I get the feeling that Sanger doesn't even understand what philosophy is for, only that it's a part of the non-STEM (to use a current buzzword) education that's abstruse and inaccessible in ways that are untouchable to the common criticism -- you may say Chaucer is boring, but not so of Leibniz.

As far as you're willing to accept a partial critique stemming from what his text implies between the lines, what drips from it when you squeeze it for meaning, I could paste the entire response essay. But note how all his examples come from the unwillingness of kids to study The Western Canon (he uses this phrase a number of times) and how his examples of anti-intellectualism come from people actually building things. And books, the obsession with books. Books and Chaucer are his litmus test for whether you are for or against the progress of civilization. Multivariate calculus and maybe some Lisp are mine. So where does that leave us?

Ultimately, and leaving the doomer-of-the-month behind (hell, better iterations of "oh, the liberal education is dying and yet it's worthy" appear thrice a week on Arts & Letters Daily), the problem with Wikipedia is that self-directed study along the web of links on, say, functional analysis is bound to be sub-optimal because you don't know beforehand the whole of the thing (dare I say holistic/emergent meaning?), but books are sub-optimal because the author has a preconception on how certain subjects should be presented, and if Michael Spivak decides I must boggle through R^n->R^n analysis before moving on to the calculus on manifolds, jesus, get to your point already. (Some books are excellent. I'm almost literally drowning in them). These are methods, trains are better than buses at some things and vice-versa, and I don't think anyone is in a position to prognosticate what the resultant will be in twenty years' time. Jesus, radio is alive and better than ever in the form of podcasts.posted by syntaxfree at 8:00 PM on July 22, 2012

the problem with Wikipedia is that self-directed study along the web of links on, say, functional analysis is bound to be sub-optimal because you don't know beforehand the whole of the thing

This is the result of all of those, including N. Negroponte, who fervently believe that laptops and books dropped from a helicopter over a remote village will result in a new crop of scientists when they return to check up in 10 years.posted by infini at 11:15 PM on July 22, 2012 [1 favorite]
