Sunday, December 2, 2012

Hypable, which launched last year, is aimed at a Harry Potter-like audience of tweens, teens and young adults, mostly female. That means lots of coverage of the "Twilight" series and "The Hunger Games" and less emphasis on, say, "Star Wars," which Sims said attracts an older male audience.

[Professor Karen] North, at USC, said entertainment companies are similarly scrambling to harness the immense fan energy unleashed by sites such as Mugglenet.

Second, and the reason I'm posting: it's not quite clear whose language this is—Hinch's or North's—but it seems telling that fan labor is imagined as a sort of natural resource, whose "energy" is "unleashed" by fan sites and can be "harnessed" (read: profited from) by the entertainment industry. Of course this is labor that is being appropriated. What's interesting about the language through which it is understood, here, is that it so explicitly routes that appropriation through an identification of (primarily young female) labor with natural resources, imagined as free for the taking. This usually doesn't end well.

Sunday, November 11, 2012

About a week ago I wrote a post adding to my ongoing series on puerility, observing how the cultural phenomenon of the FiveThirtyEight blog and the conflicts surrounding it exemplified a discourse in which discrete, mutually exclusive outcomes are the only imaginable ones. Then, while I drove to Maryland for a workshop, stopping in Philadelphia on the way back, about eighty people commented on my post to let me know that they were persuaded (in error) that I was somehow defaming Nate Silver personally and statistics as a field, and that it was up to them to defend both.

This seems to me to suggest two things.

First, that the same logic that gives us "Obama or Romney?" as the paramount question one can ask about an election also gives us "for or against?" as the paramount question one could ask about the FiveThirtyEight blog. This is in no way an interesting question to me, but for many people it was the only question, and therefore my post could only be read as answering it. This reduction to discrete, mutually exclusive, and usually binary outcomes legible in the terms of a game is of course what I was identifying as a form of puerility in the first place. Obama or Romney? Statistics or "gut"? Nate Silver or Politico? Quants or scouts? These questions clearly generate a great deal of pleasure, as evidenced by the enthusiasm with which they are debated, but there are other questions, involving words like "why" and "how," that are worth discussing.

Second, that childhood is so overwhelmingly treated as a debased category that to invoke it is considered an insult.* In addition to its literal meaning of "boyish" (Latin puer), "puerile," in common usage, carries a pejorative connotation, of course, but that's its least descriptive and least useful aspect, which is why I set it aside. Puerility, in the sense of the performance of child masculinity, is one of the most powerful political forces in the present moment; that's why there is a "Nate Silver phenomenon" in the first place. It should go without saying that anyone can engage in this performance, but it is also worth noting (so I noted it) that Silver's public persona (white, male, youthful, virtuosic) makes him a particularly good candidate, out of the many people and organizations aggregating polls, to emerge as the celebrity of popular political statistics.

On election night it was interesting (though not surprising) to observe how, once the presidential race was called, Silver began to be celebrated on Twitter (elsewhere too, I'm sure, but Twitter is time-stamped) as if he were the magical wizard that, prior to the election, Silver himself so patiently tried to explain that he was not. A lot of the tweets were really funny, but many of them oddly called the Obama victory "a win for statistics" or even "a win for reality," as if to suggest that the validity of either were contingent on who won the election.** Some even explicitly trod into "Nate Silver IS a wizard!" territory:

Such celebrations seemed to concede (erroneously) that Scarborough et al. had a point in the first place— that a Romney win would have falsified Silver's model, and that Silver's model were based on occult wizardry rather than weighted averages of widely available polling data. As Siva Vaidhyanathan put it:

You realize, of course, that if Romney had won Nate Silver's prediction would still have been right.

And surely many of the people declaring Silver the real winner of the election knew this, and had even, prior to the election, said it. This put no damper on the explosion of Silver jokes, however; the pleasure of play trumped the basic premises of the very thing being celebrated. The cultural meaning of statistics was precisely puerile at this moment, openly signifying "winning team" more than it signified the actual principles of statistics.
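It is worth being concrete about what the Backlash mystified. A "weighted average of widely available polling data" is arithmetic simple enough to sketch in a few lines. The particular weighting scheme below—discounting polls by age with a half-life, crediting larger samples by the square root of their size—is a hypothetical illustration, not Silver's actual model, which is far more elaborate.

```python
# A minimal sketch of the "weighted average of polls" idea: more recent
# polls and larger samples count for more. The half-life decay and
# sqrt(sample size) weights are illustrative assumptions, not
# FiveThirtyEight's actual methodology.
def weighted_poll_average(polls, half_life_days=14.0):
    """polls: list of (candidate_share, sample_size, age_in_days) tuples."""
    num = den = 0.0
    for share, n, age in polls:
        weight = (n ** 0.5) * 0.5 ** (age / half_life_days)
        num += weight * share
        den += weight
    return num / den

# Three made-up polls: shares, sample sizes, days old.
polls = [(0.51, 800, 2), (0.49, 1200, 10), (0.52, 600, 21)]
print(weighted_poll_average(polls))
```

The point of the sketch is only that nothing occult is happening: the output is a compromise among publicly available numbers, tilted toward the fresher and better-sampled ones.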

Another form of data analysis was also declared a winner after the election, it is worth noting—the data-mining that enabled the Obama camp's microtargeted get-out-the-vote effort. This was swept into the same category as Silver's poll averages and made a cause for celebration. But as Zeynep Tufekçi, who had earlier argued that work like Silver's had the potential to limit the puerile logic of the horse race, observed, data-mining is ethically neutral at best, and is as eagerly pursued by Target as by the Democratic Party:

Winning by big data driven ground game & micro-messaging is appealing (not saying all that it was) but it's policy neutral. Who next time?

Or as Alexis Madrigal put it, "Data Doesn't Belong to the Democrats". "The left's celebrating the analytical method right now as if it belonged to them," Madrigal writes. "But it doesn't. [...T]his election was not a triumph of data over no data, of rigor over hunch. The 2012 election was a triumph of Democratic data over Republican data." What Madrigal predicts next is a data analysis war, as Republicans struggle to catch up with and exceed the facility already achieved by Democrats.

This is indeed probably what will happen in 2016, and it is about winning. In such a discursive environment, we can easily have another election in which drone strikes are not up for debate at all. But who wins—the question that FiveThirtyEight and the political parties' data-mining efforts each, in their different ways, attempt to answer***—only ultimately matters in the context of policy questions. Are we able to ask them?

-----
*This is complicated, to say the least.

**This is in contrast with the predictions in individual states, which, taken together, are rather more meaningful for evaluating the model.

***I.e., Silver tries to answer by prediction based on polling data, while the data-miners answer by working to secure a particular outcome.

The "Nate Silver phenomenon" is a perfect example of Second Gilded Age puerility, a form of political commentary that is concerned not with meaning or ethics but rather with phenomenality, especially as translated into abstract forms, chief among them numbers. When I use "puerility" in this way, I don't mean it pejoratively but literally: this is a form of boyishness, as boyishness has been constructed in U.S. history. It's concerned first and foremost with abstract play—even a certain virtuosity with play—and it is entirely bound up in its own game. And it is a game that may be a little ruthless, a game that implicitly must be played by a white, boyish figure, a Tom Sawyer who insists on playing even when a slave's freedom is at stake.* Silver's Wunderkind image creates a kind of persona from whom we are prepared to receive statistical models; it is entirely appropriate that his statistical forecasting began not in politics but in sports.

Nate Silver's models can tell us how likely it is that Obama will "win" (the game). They can't, and absolutely do not aim to, explain, say, the role of race in the election. And they cannot give definitive predictions either, only probabilities: that's the point. Statistics always pulls back from the claims it makes; if it did not do so, it would not be statistics. Statistics is an inherently puerile discipline, not because it is dominated by men but because its principles concord so strongly with the way we have constructed boyhood—an unrelenting commitment to the play of abstract forms above all else: above wishes, above belief, above ethics, its only ethics being a commitment to the rules of the game. It presumes being unable to really know "the answer," except as defined and bounded by the game.

The Silver backlash has a huge problem with this. In the Politico piece that seems to crystallize the backlash, Dylan Byers quotes the NYT columnist David Brooks:

"I should treat polls as a fuzzy snapshot of a moment in time. I should not read them, and think I understand the future," he wrote. "If there’s one thing we know, it’s that even experts with fancy computer models are terrible at predicting human behavior."**

The key here is the word "understand." Brooks thinks that Silver thinks he "understands" the future. But understanding has nothing to do with it; there are simulations, and they indicate the probability of potential outcomes. It's not understanding; it's pointing.
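What "pointing" looks like in practice can itself be sketched as a toy simulation: draw many hypothetical outcomes from an assumed distribution around a polling average, and report the share in which a candidate clears 50 percent. The specific numbers below—a 50.8% mean, a 2-point standard deviation—are made up for illustration; the real models are, of course, more involved.

```python
# A toy Monte Carlo sketch of simulation-based forecasting: the output
# is a probability of potential outcomes, not an "understanding" of the
# future. The mean and standard deviation here are invented for
# illustration only.
import random

def win_probability(mean_share=0.508, sd=0.02, trials=100_000, seed=42):
    rng = random.Random(seed)
    wins = sum(rng.gauss(mean_share, sd) > 0.5 for _ in range(trials))
    return wins / trials

print(win_probability())  # a probability, not a verdict on who "will" win
```

Note that nothing in such a sketch claims certainty: if the simulated candidate loses, a 65% forecast was not thereby "wrong," which is exactly Vaidhyanathan's point above.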

The Defenders accuse the Backlashers of two things—ignorance of statistics and a reflexive personal hatred of Silver founded on defensiveness—and suggest that the two are, in essence, identical. The latter, it is interesting to note, is a highly psychologized accounting. In TechCrunch, Gregory Ferenstein writes,

Why does Silver, who is really just an apartisan puzzle-solver, inspire so much loathing? Because his results reveal a psychologically disturbing fact: we live in an uncontrollable, unpredictable world.

As far as Ferenstein is concerned, the reason Silver is being criticized is that he reveals a truth that some people can't handle. That truth is, essentially, that statistics is a valid method of producing knowledge about reality; in other words, decrying Nate Silver reveals an ignorance that is the same as a psychological weakness. Klein's take is less cosmic but equally psychological:

Lots of pundits don't like Nate Silver because he makes them feel innumerate. Then they criticize him and prove it.

Klein packs a slight dig at the Backlashers' masculinity in that phrase, "makes them feel innumerate." "Innumerate" is code for "inadequate," but a particular kind of inadequate; it's a castration complex built on an ignorance of statistics. Silver, as a methodologist and as a person, is "threatening" to traditional journalists (Ferenstein). The Defenders impute wrong feelings to the Backlashers, wrong feelings that are indistinguishable from wrong knowledge.

What is so interesting to me about the Defense—which is, if anything, more impassioned than the Backlash—is that it finds in the Backlashers a profound moral failing. Ferenstein's TechCrunch piece literally includes a picture of Galileo, calling up a larger-than-life myth of the forces of dogma unfairly pursuing a scientific crusader whose crime, as the Indigo Girls would have it, was "looking up the truth." Yup; Nate Silver, Galileo; I totally see it.

But the real failure of the Backlashers is a little more complex: not a moral failure, but rather a failure to be okay with the moral absence at the heart of statistical methods. The Silver backlash wants an answer, a position; it wants Silver to stop playing around. In other words, it reads statistics itself as waffling and double-tonguery. It's not wrong in that sense. It just fails to appreciate that that is more or less the entire point of statistics: to measure what is irreducibly uncertain.

There's certainly a basic failure to grok statistics that underwrites the comments by Joe Scarborough, David Brooks, et al. (the Backlash). And undoubtedly they are craven, miserable, petty people, as Ferenstein, Klein, and others suggest, although I have my doubts about proving the latter by way of the former.

But it's also important to break out of the puerility sandbox for a minute. We do not have to suppose, as Robert Schlesinger does in U.S. News, that the only alternative to "quants" is "gut feeling traditionalists" and "conventional wisdom"—that is, non-knowledge. There are good reasons to be wary of the statistical mode, if not reasons dreamed of by David Brooks, and they do not necessarily involve siding with the Ancients in a battle against the Moderns. Klein writes that

If you had to distill the work of a political pundit down to a single question, you’d have to pick the perennial “who will win the election?” [...] Now Silver—and Silver’s imitators and political scientists—are taking that question away from us. It would be shocking if the profession didn’t try and defend itself.

Perhaps so. But what if that weren't, after all, the question?

A Nieman Lab defense of Silver by Jonathan Stray celebrates that "FiveThirtyEight has set a new standard for horse race coverage" of elections. That this can be represented as an unqualified good speaks to the power of puerility in the present epistemological culture. But we oughtn't consider better horse race coverage the ultimate aim of knowledge; somehow we have inadvertently landed ourselves back in the world of sports. An election is not, in the end, a game. Coverage should not be reducible to who will win? Here are some other questions: How will the next administration govern? How will the election affect my reproductive health? When will women see equal representation in Congress? How will the U.S. extricate itself from permanent war, or will it even try? These are questions with real ethical resonance. FiveThirtyEight knows better than to try to answer them with statistics.**** But we should still ask them, and try to answer them too.*****

Nate Silver's Log, 2016. I've been on the run for four years now, spreading polling averages to them that needs 'em most...

The first woman I have seen to comment on the Silver Wars is Margaret Sullivan, the New York Times's public editor, and it was to reprimand Silver for playing around—literally. Silver offered to make a bet with Backlasher Joe Scarborough about the outcome of the election. Aunt Polly Sullivan writes that "[i]n a phone conversation, Mr. Silver described the wager offer as 'half playful and half serious.'" Which is, of course, the essence of the appeal of FiveThirtyEight.

-----
*I am of course alluding to the ending of The Adventures of Huckleberry Finn.
**It's bad enough to have to link a Politico piece, but I'm not linking Brooks.
***I have not yet seen a Silver defense by a woman, although I suppose they must exist. I have seen many, prominently placed, by men, however.
****Silver's NYT colleagues, Dubner and Levitt, do not know better, unfortunately.
*****I am all too aware that a perfectly plausible gloss of this post would be: "don't hate the player; hate the game."

Monday, October 22, 2012

Once upon a time I would have blogged not only THATCamp Theory but also MSA. As it is, however, all I seem to have in me is an acknowledgment that the Las Vegas airport has free wifi. Which is, I suppose, something.

Tuesday, October 2, 2012

It's strange to see a conversation happen in your Twitter feed, mainly among people you know, and then watch that conversation get written up at Inside Higher Ed. Such was the recent "Twittergate," ironically so dubbed by Roopika Risam. The question was whether and under what circumstances it is ethical to live-tweet a conference.

I have a hard time taking the question seriously. I tend to sympathize with Eleanor Courtemanche's quip:

For me, #twittergate is ALWAYS that tweets from conferences are TOO BORING (exc. mine, & OK all those from #navsa12)

If we do any sort of public writing, whether on blogs, in print, or elsewhere, then we have had to make our peace with the partible personhood function of writing. You will be misread, misquoted, taken out of context, and distorted. And that's if you're lucky and are read. How dreary to be somebody!

Conferences are a sort of academic Facebook; they give the illusion of privacy and safety, while you're under a diffuse but constant surveillance. A certain segment of Facebook fans has a horror of Twitter and its public ways, because on Twitter that illusion of privacy is gone. But you know: it was only ever an illusion. Live-tweeting a conference only reveals and makes searchable (and renders amenable to response) the Telephone-relays already pervading our academic life.

I think the earliest blog post on the subject, Tressie McMillan Cottom's, is also the most interesting and nuanced. She addresses the most substantive critiques of live-tweeting: that it participates in the tendency of "openness" to devolve into commodification, and that it violates the speaker's expectations.

Still, the Inside Higher Ed article seems to unintentionally make the case that we have less to fear from Twitter than from journalism. As Alexis Lothian and Mark Sample observe, Twitter was responsible and careful where Inside Higher Ed was not.

I want to add only one thing, which irks me every time a conversation on the etiquette-ethics spectrum comes up. (That etiquette is so often discussed as an ethical imperative is itself, in my opinion, a problem, but a different one.) Inevitably there are calls for "BASIC MANNERS" and "COMMON CIVILITY" and "just don't be RUDE didn't your mama teach you better" and the like. I have no sympathy with the position that live-tweeting is just obviously rude.

There is no such thing as "basic manners." "Polite" (or socially affirming) in one context is rude in another, and vice versa. Or your mama may not have taught you "better." Maybe you had a bad mother; is that supposed to be the point? Why bring people's mothers into it? I like a fast conversation in which the conversants are so excited that they interrupt one another; I interrupt people, they interrupt me. Is this rude? Sometimes. Other times, as a friend once said to me when our interruptive conversation turned meta, "whatever; I'm from New York."

In Gender and Discourse, Deborah Tannen argues that the social meanings of linguistic acts are cultural, contextual, and mutually produced by conversants. Interruption can produce sensations of affirmation; the linguistic gestures of solidarity can be confining as well as affirming. "For example," she writes, "one can talk while another is talking in order to wrest the floor; this can be seen as a move motivated by power. Yet one can also talk along with another in order to show support and agreement; this must be seen as a move motivated by solidarity" (19). Moreover, "the two...are not mutually exclusive" (19).

So what is "basic manners"? If we take seriously the diversity of experience, then it's certainly not a universal baseline about which we get to wag our fingers. We need an academic bestseller on "the dorky art of faux-pas," because I feel that this fact is underappreciated. Academia remains deeply riven with racial and especially class codes that are absolutely inscrutable to, for instance, first-generation academics. ("Didn't your mother...?" "No, my mother has no notion of how to comport oneself at an academic conference, nor is she especially clear on what an academic conference is.") As Lisa Delpit has so powerfully shown, a white teacher's politeness can, to her black child student, be no more than a maddening obfuscation.

I taught Melville recently, so I'm going to leave the last word to Melville:

Shifting the barrow from my hand to his, he told me a funny story about the first wheelbarrow he had ever seen. It was in Sag Harbor. The owners of his ship, it seems, had lent him one, in which to carry his heavy chest to his boarding house. Not to seem ignorant about the thing—though in truth he was entirely so, concerning the precise way in which to manage the barrow—Queequeg puts his chest upon it; lashes it fast; and then shoulders the barrow and marches up the wharf. "Why," said I, "Queequeg, you might have known better than that, one would think. Didn't the people laugh?"

Upon this, he told me another story. The people of his island of Rokovoko, it seems, at their wedding feasts express the fragrant water of young cocoanuts into a large stained calabash like a punchbowl; and this punchbowl always forms the great central ornament on the braided mat where the feast is held. Now a certain grand merchant ship once touched at Rokovoko, and its commander—from all accounts, a very stately punctilious gentleman, at least for a sea captain—this commander was invited to the wedding feast of Queequeg's sister, a pretty young princess just turned of ten. Well; when all the wedding guests were assembled at the bride's bamboo cottage, this Captain marches in, and being assigned the post of honor, placed himself over against the punchbowl, and between the High Priest and his majesty the King, Queequeg's father. Grace being said,—for those people have their grace as well as we- though Queequeg told me that unlike us, who at such times look downwards to our platters, they, on the contrary, copying the ducks, glance upwards to the great Giver of all feasts—Grace, I say, being said, the High Priest opens the banquet by the immemorial ceremony of the island; that is, dipping his consecrated and consecrating fingers into the bowl before the blessed beverage circulates. Seeing himself placed next the Priest, and noting the ceremony, and thinking himself—being Captain of a ship—as having plain precedence over a mere island King, especially in the King's own house—the Captain coolly proceeds to wash his hands in the punch bowl;—taking it I suppose for a huge finger-glass. "Now," said Queequeg, "what you tink now?—Didn't our people laugh?"

—Moby-Dick, Ch. 13

-----

Delpit, Lisa. Other People’s Children: Cultural Conflict in the Classroom. New York: The New Press, 1995. Print.

Monday, August 27, 2012

Friends and Melville fans, I am teaching Moby-Dick this semester as part of a Readings in American Lit course.

My specialization centering some fifty years later, I am not one of those people who teaches this hoary beast (so to speak) every semester; so, far from being bored by the prospect, I am thrilled.

I've mentioned this to a number of people, and many have replied that they've never read Moby-Dick. They've thought about it, but they've never had the occasion, or the time, or they've felt it was really the sort of thing you needed a course context within which to read.

So I invite any interested parties to read Moby-Dick alongside me and my students, August 31 through September 19.

What would this entail?

1. The most important thing—really the only important thing—is reading the novel. A schedule is posted below.
2. My course blogs are sadly closed to the world, but I will post briefly here over the course of the three weeks; you're invited to read and comment. [UPDATE: it's possible my course blogs can become public. More soon.]
3. If you like, blog your own responses to the novel.
4. My course Twitter hashtag this semester is #f12ral (which stands for Fall 2012 Readings in American Lit). You are welcome to join in on any Twitter discussion, and if you blog about your reading, please tweet a link with the hashtag.

If you decide to read Moby-Dick with us, I'd very much like to hear from you about it. Drop me a line, leave a comment on this post, or tweet using the hashtag. If anyone decides to join us, I'll let my students know; I think they'll be tickled.

Most of the people who will encounter this post are not full-time students and will find the schedule a little onerous; it's 135 (short) chapters in just three weeks. You might fall behind in the reading, and that's no moral failing. All I can say is that it can be done (I think I had even less time to read it in college—maybe a week), and it will be done by my students, and maybe also by you.

The edition we are using is the Norton Critical. I will confess that I am not enormously a fan of this edition: neither the thin Bible-paper that makes the text bleed through from page to page, nor the distracting footnotes that turn out to be no more than dictionary definitions for relatively common words for, I suppose, potential readers with exceedingly narrow vocabularies. Nonetheless, this is the standard edition in my department and I did not have the choosing of it. Moreover, if you wish to read some contemporaries accusing Melville of lunacy, the Norton's got you covered. I see no reason to disdain the Penguin, however.

Saturday, August 18, 2012

It's been a few years since I last visited Chicago. I'm in town for a wedding, and for some reason the weather decided to be perfect. I know that there are others who will understand that although Chicago is a large city, full of museums and restaurants and interesting neighborhoods, there is something that always pulls me back into the Hyde Park vortex. Could it be the books?

When I went to the Seminary Co-op, which hasn't yet moved, I was shocked to see that they didn't have Jen Fleissner's book on the shelf, and that they had only one Ian Hacking title in the history of statistics section. These are, indeed, outrageous omissions (Women, Compulsion, Modernity is a Chicago title, for FSM's sake!), but it was wonderful to be in a place where I could have such expectations in the first place. And it's easy to imagine that they were probably gone because someone in the neighborhood had recently bought them.

Powell's. <3

The Regenstein Library, plus the new robot-operated underground stacks.

Stephen Crane is a master of this particular version of puerility; I think especially of the Whilomville story "Lynx-Hunting," in which a group of Whilomville boys, led by Jimmie Trescott, stolidly defend the town against a grazing cow.

Crane, too, is continually said to work in miniatures; thus Michael Fried reads the final scene in The Monster as a scene of "reading painfully what has already been written, with the stove representing a domesticated (in effect miniaturized) version of the catastrophic fire" (142). Indeed, Fried argues, "two opposing tendencies, one toward miniaturization and the other toward a certain monstrosity, coinhabit Crane's prose" (141). The same could be, and has been, said of Wes Anderson's filmmaking.

This brings us back to Noye's Fludde, the systematized, aestheticized miniature of the real flood happening outside, which in the film takes on the cosmic significance of Noah's Flood, the narrator going into some detail about its historic devastation. At a certain point the real flood takes precedence over everything else, disrupting Noye's Fludde and revealing every system as miniature, as diminutive.

Such moments appear in Crane as well. In The Material Unconscious, Bill Brown addresses Crane's poetry only once, in order to reveal the dimension of childish play latent in "The Open Boat":

The ocean speaks the lines of the poem, asking that the weeping woman on shore be told that her lover is dead: "Her lover I have laid/ In cool green hall." The second and final stanza supplements the message:

"Tell her this
"And more,—
"That the king of the seas
"Weeps too, old, helpless man.
"The bustling fates
"Heap his hands with corpses
"until he stands like a child
"With Surplus of toys." (W, 10:22)

The lines intimate an understanding of life and death that would make the entirety of "The Open Boat" intelligible as "play".... (Brown 123-4)

The great fear is that there is no end to this regress, that there are, indeed, no grown-ups in the room. Not only are all the adults invested in miniature systems; the Cosmic Adults are so many babies as well, pulling the heads off dolls.

Thus in Crane's story "Death and the Child," the unaware toddler playing on a mountaintop, accidentally abandoned by the evacuating villagers, is possessed of a godlike perspective on the battle below. To him the action looks like a doodle, "fantastic smoky shapes" and "white circles and whirligigs" and "[l]ines of flame" (Crane 962). When young Peza, foolishly overeager for battle, reaches the mountaintop and finds himself face to face with this baby, it is the baby who is in a position to inquire, "Are you a man?"

Of course, we don't quite have the same fear that gods are babies in Anderson's films. The weather exerts its whims, but there is always ultimately a grown-up chaperoning things—Anderson himself. The craftedness of his miniatures reminds us that somebody has things under control.

That that register of control—the aesthetic—is the same register as that of the miniature, e.g. the church production of Noye's Fludde, however, may give us a moment's pause. In the end this film is deeply sympathetic to the ridiculous seriousness with which children and especially adults invest their play. For in the film, aesthetic satisfaction appears to be the only available site of even fictive shelter. One can but work on that production of Noye's Fludde, or pull a crisis back into the realm of Khaki Scouting by inspecting the camp and issuing a Commendable.

As Crane writes in Black Riders:

If there is a witness to my little life,
To my tiny throes and struggles,
He sees a fool;
And it is not fine for gods to menace fools. (Crane 1303)

Monday, June 25, 2012

I forgot to mention in my previous post one of my favorite things about Moonrise Kingdom, which was the liberal use of Benjamin Britten's works in the soundtrack. Movie nerds will recognize the theme from his Young Person's Guide to the Orchestra as the Rondeau from Henry Purcell's Abdelazar, which was adapted (quite effectively, for solo violin) for a key scene in The Lesser Adaptation of Austen's Pride and Prejudice.

"I'm a raven."

The amateur production of Britten's Noye's Fludde, staged in a local church and replete with children dressed as animals, is, like the Khaki Scouts, its own kind of child-adult collusion in overenthusiasm. Suzy is a failed raven just as Sam is a failed Khaki Scout, failed on social grounds rather than out of incompetence. It is a "play" that is taken utterly seriously, especially by the grown-ups (like the one who demotes Suzy from her raven role). Play taken too seriously, or serious enterprises (like child care) rendered all too game-like (as when the best scout of all, Scoutmaster Ward, manages to lose first Sam, then the rest of the troop), continually threaten happiness. The only possible resistance is yet another system, an alternative game, a union between Sam's wilderness skills and Suzy's fantasy world, the game of their private Moonrise Kingdom.

Thus when all the other social systems of discipline converge, they do so at the church, in the midst of Noye's Fludde, in order to escape the actual flooding outside.

Britten is a serious, even difficult composer who has, when you think about it, written a great deal for children—both child audiences and child performers. He's especially known as a composer of liturgical music in a tradition famous for its boy choristers. I couldn't help noticing a movement from his Simple Symphony when it appeared in the film—a movement tellingly titled "Playful Pizzicato"—I'd played it as a child, after all. Using childish sounds—"playful" pizzicato (plucked strings), glockenspiels, high-pitched child voices, and at times almost comical bombast (including in the didactic Young Person's Guide)—Britten proffers Middle English texts and challenging harmonies. Thus, within the world of the film, his music marks the oscillations between "too easy" and "too hard" that mark every educational pursuit, every system for cultivating the self.

Anderson's own oeuvre, so often described in the terms of miniatures and toys (and including an animated adaptation of a children's book, Roald Dahl's 1970 Fantastic Mr. Fox), aspires to similar comminglings. In Moonrise Kingdom, Anderson focalizes childhood as a site of real difficulty, one whose difficulties are not discontinuous with those of adulthood, and indeed, one whose difficulties are most adult when they reside in the domain of play.

Suzy lugs a suitcase of stolen library books through the wilderness, imaginative resources for building a private universe. Her fictions are bulwarks against the flood.

Sunday, June 24, 2012

There is hardly a clearer example of Foucauldian power than the Boy Scouts—a most codified set of techniques of the self, each self a set of badges pinned to the uniform.

I saw Moonrise Kingdom last night; the centrality of its "Khaki Scouts" highlights the way in which overinvestment in such systems conduces to tragicomedy, especially in Wes Anderson's films (The Life Aquatic, The Royal Tenenbaums, and Rushmore function similarly).

Particularly notable to me are the juxtapositions of adults and children; there is often intergenerational buy-in. There is something truly hilarious about a group of boys taking their scouting very seriously. Funnier still is the truest true believer, the adult Scoutmaster Ward, who avers that being a scout master is his real job: "I teach math on the side." But the boys' and men's beliefs exist in the same plane; Sam's adorable self-importance as he gives Suzy camping tips (of very widely varying utility) is later validated by his scout master's manner of offering sympathy: "I wish we'd had time for an inspection back there. I would have given you a Commendable."

As ridiculous as the Khaki Scouts are, they are soon revealed to be no more ridiculous than the other disciplinary institutions that they mimic—the law, as figured by Suzy's lawyer parents; the state, as figured by Social Services (Tilda Swinton, in some of the film's most visually striking moments—of course); and perhaps the most absurd of them all, the police, as figured by Commander Sharp. In a climactic scene, all four avatars of systematized discipline bark into walkie-talkies attempting to sort out the proper placement of the two children, four criss-crossing domains of authority emblematized by five bewildered—but still entirely invested—adults. At the end, when Sharp agrees to foster the orphaned Sam, Sam switches out the Khaki Scouts uniform in which we have always seen him for a miniature police uniform. He has merely switched systems.

In Three Guineas, Virginia Woolf mocks men's love of fancy dress by pointing to the pomp and circumstance of the military. Busby Berkeley's Footlight Parade (1933) contains a military dance sequence ("Shanghai Lil") that indeed quite undermines any distinction between the military and the Tiller Girls when it comes to examples of the mass ornament. What I am getting at is that there is a pettiness in these systems—and it is precisely the pettiness that interests Anderson—that we may identify as a form of puerility. It is "boyish" behavior, both highly elaborated and ridiculous, even if adults are frequently the originators of that puerility. (What is The Life Aquatic if not a story about a man playing with the people and things around him as if they were so many toys?) These are systems of play entered into for their own sake, and prioritized regardless of the consequences. (A pet dog is killed in one encounter; when confronted with this fact, Sam's nemesis shrugs and says something to the effect that it can't be helped; the dog is a casualty of war.)

Puerility and its powerful appeal—its necessity, even—is one of Anderson's continual themes. Why do people invest themselves in ridiculous systems? When is such investment reprehensible? From what standpoint is one capable of distinguishing between puerility and grandeur of vision—or does any such distinction exist?

Thursday, June 21, 2012

In Electric Animal, Akira Mizuta Lippit identifies the animal cry as the limit point beyond speech:

The animal cry signals the moment of contact between those two ontic worlds: the cry is, as Derrida explains, a signal burdened with the antidiscursive force of animality and madness. Burke's 1757 reflection on the sublime includes a section on "The Cries of Animals." For Burke, the experience of the sublime aroused by the animal's cry imposes a moment wholly outside time—an extemporaneous moment—in which the dynamics of reason are temporarily halted. (43)

Paul Klee, Twittering Machine, 1922

Yet a long philosophical tradition (including Kant) also locates in the animal cry the source of human speech, insofar as speech is imagined as originating in the mimicry of animal sounds (Lippit 41).

So there is something coy about the way that Twitter names itself after animal sounds, as if to suggest that there is something fundamentally antilinguistic about social media text. "Don't mind us," it seems to say; "we're just twittering, like animals. No language to see here."

I think that in some cases this makes people feel as though they have to live up to a kind of antilinguistic standard on Twitter, to introduce noise gratuitously as if in homage to the medium—as if to make it really tweeting. That's the only explanation I can think of for tweets like this one from Senator Chuck Grassley (R-Iowa; @ChuckGrassley), who, I imagine, writes like this only on Twitter:

In contrast with common abbreviations and slang, which are underwritten by identifiable (if diverse) logics, here the abbreviations and nonstandardisms seem random, even perverse. For example, "evr." does not save any characters; a period would seem better spent at the end of the first, unstopped sentence. As for the wasted space before the question mark or the capitalized "Learn"—what can these be but antilinguistic performances? (I was interested to learn, incidentally, that Sen. Grassley shares my hatred of the "History" Channel.)

T. S. Eliot, from The Waste Land, 1922

The fact that the animal in question with Twitter is the bird adds another dimension to consider. Bird-talk is gendered feminine, from the speechless Philomel ("twit twit twit"—she is turned into a bird to enforce her speechlessness, when cutting out her tongue is not enough) to the cheeping and twittering of the town women in The Music Man:

It's no wonder Twitter is seen as a site of gossip and rumor, intellectual triviality and linguistic disaster. It intentionally casts itself as mere animal noises, or, what evidently amounts to the same thing, female speech. And as I've suggested elsewhere, the radical multiplicity of voices on Twitter likewise suggests a flock indiscriminately cheeping.

This is undoubtedly the source of the fears that are occasionally raised that social media are making us lose our grip on language, as if that were a thing that could be so easily lost. (Try writing like Gertrude Stein. It's not easy.) To lose language might just be to lose our humanity, and then where would we be?

Well, the posthuman turn is so five years ago that it's difficult to get exercised about such a question. The interesting implications do not lie in fears of loss, for we are all already cyborgs or animals.

When the birds attack Bodega Bay in Hitchcock's film (1963), a terrified mother lashes out at the film's avatar of liberated (and threateningly undomesticated) femininity, Melanie Daniels (Tippi Hedren): "I think you're the cause of all this. I think you're evil!" One reading of The Birds would take the birds as a furious feminine multiplicity, attacking domesticity and the family as if in revenge.

The emails are frankly shocking: they seem to indicate that the Board was acting not out of vision, but out of a fear: that MOOCs (Massive Open Online Courses), à la Stanford's Udacity (I know, worst name ever) and MIT's EdX (second worst) were revolutionizing the university, the way that The Jazz Singer made silent film obsolete... according to Singin' in the Rain (1952), although not according to actual film history. In the Board's view, MOOCs were about to make UVa obsolete, and Teresa Sullivan wasn't jumping on the bandwagon with sufficient "strategic dynamism." And where did they get this idea? From an article in the Wall Street Journal and a David Brooks column.

Allow that to sink in for a minute. They took a David Brooks column seriously.**

The bitter irony is that UVa is actually packed to the gills with experts in online learning and media—smart people versed in the literature who have actually considered this as a complex pedagogical and research question, who see online learning as an intellectual opportunity and not just a cheesy get-rich-quick scheme. One of the Board's most outspoken critics, Siva Vaidhyanathan, is a media studies professor; UVa is also home to the Institute for Advanced Technology in the Humanities and the Scholars Lab. I'm sure I haven't even scratched the surface with these few examples. But why ask experts, when you can pass around a David Brooks column?

There's a widespread stereotype that academics are anti-business, a stereotype usually framed in a notion of academics as head-in-the-clouds ivory-tower-dwellers and business as "real" ("economic realities," folks! also "excellence"!).*** But that's not really true. What academics almost by their very nature oppose is ignorance and anti-intellectualism.

Given what we know from the emails FOIAed by the Cavalier Daily, the recent actions of the Board of Visitors of the University of Virginia are driven by a profound, thoroughgoing anti-intellectualism, one that rejects expertise as such.

This is far from an isolated incident—just one whose consequences were sudden, drastic, and highly visible.

You may remember a minor uproar in early May when a blogger for the Chronicle of Higher Ed (of all places), Naomi Schaefer Riley, declared Black Studies a worthless discipline because she had read some dissertation titles and did not understand them. I called this "anti-intellectualism, déjà vu," because we'd seen it all before. At the time, the world's most ineffectual PR flack, the Chronicle's Amy Lynn Alexander, suggested that readers redirect their outrage toward the defunding of the California State University system, which was happening concurrently.

Gautam Premnath rightly pointed out that anti-intellectualism like Riley's—the idea that a pundit could just dismiss whole disciplines out of hand based on a proud lack of knowledge—was precisely what made such attacks on public education possible.

Here we see it again: same song, different verse. Why consult experts at our own university on matters of substance in which they are expert, when we can listen to a pundit? Why, for that matter, study media history when you can just watch Singin' in the Rain? Why aim for true when plausible is right in front of you?

Why, indeed, go to university at all?

No wonder the Board of Visitors thought the University of Virginia was being made obsolete by TED Talks on the internet. Given what the Cavalier Daily has uncovered, that really is what passes for knowledge with them.

**Enjoy the Mark Liberman classic "David Brooks, cognitive neuroscientist." Liberman is one of the few people who still goes to the trouble of zinging pundits for not knowing what on earth they're talking about. Most of us just accept it as part of life.

***If I could figure out where I laid down my volume of Henry James essays, I would quote the early review of Nana in which James defends against the notion that things that are nasty have some kind of privileged reality-status. I can't seem to find it at the moment, and this is probably going to bother me until I do. Argh.

Saturday, June 9, 2012

So, the sad truth is that I have started a Tumblr. It was time for me to understand Tumblr a little better.

The differences between a blog and a Tumblr, apart from the social/sharing dimension of Tumblr, are subtle, but generally speaking Tumblrs seem to me to be very much about content over personality, and personalities on Tumblr are very self-consciously performed. All of Tumblr is in drag. This is, of course, appealing.

Meanwhile I've been blogging very rarely here, perhaps because I'm still wandering in the wilderness when it comes to the piece I'm working on at the moment, which is and is not about the writings of Stephen Crane.

The Tumblr is a good place to put orts and fragments, and that's where I've recently deposited a few instances of a phrase that strikes me as epitomizing Henry James's style: "beyond everything." He certainly uses the phrase a lot, although, as a quick Google Books ngrams search suggests, James was writing rather at the peak of "beyond everything" (but then again, how much of the GB corpus is simply saturated with James?—And who or what among us is not, rightly, saturated with James?).

Mark Seltzer's Bodies and Machines (1992) is pretty unmistakably a book about literary naturalism, but Seltzer continually has to talk about "naturalism and realism," or at times use one or the other word to refer to both categories. That's fair: nobody quite knows the difference between naturalism and realism, although we're all prepared to say that The Rise of Silas Lapham and The Portrait of a Lady are "realism" and McTeague and Sister Carrie are naturalism, and feel like we know what we mean when we make the distinction. One of the few people to offer a really definitive statement on the matter was Frank Norris, who wrote of naturalism in "Zola as a Romantic Writer" (1896),

This is not romanticism—this drama of the people, working itself out in blood and ordure. It is not realism. It is a school by itself, unique, somber, powerful beyond words.

But what is this distinction exactly, that separates naturalism from both romanticism and realism? When you're reduced to calling a literary genre (made of words, after all) powerful beyond words, you know something is up; we're asked to believe that this is a form of representation that can rise up and shed its status as representation. Instead of words we have "working out" in "blood and ordure." (One thinks of the "lines" on the battlefield in Stephen Crane's fiction, which perform the reverse action—blood and ordure continually admitting themselves to be "words, words, words," as in Michael Fried's classic reading.) For Norris, however, naturalism is not even really about words anymore; it's just beyond—"beyond everything." There's a reason Mark Seltzer's book about naturalism ("and realism") has an entire chapter on Henry James.

To be "beyond everything" is to be at the limit, outside. James is a great user of the unqualified "everything," "everything" as an answer or as a landing place, not "everything else" or "everything I can think of" but the totality, the absolute kitchen-sink inclusion of all that is or can be. "Everything" is so inclusive as to lack meaning; this ambiguity is, as it were, everything to The Wings of the Dove (1902), for example, as we see in an exchange between Kate Croy and Merton Densher (about what a horrible person Kate's father is):

[MD:] "It's so vague that what am I to think but that you may very well be mistaken? What has he done, if no one can name it?"

[KC:] "He has done everything."

"Oh—everything! Everything's nothing."

It's therefore a kind of cop-out to call anything "beyond everything," and yet at times necessary to mark the place where representation fails and we are forced, like Frank Norris, to helplessly and oxymoronically declare ourselves simply beyond.

Nymphs and nuns were certainly separate types, but Mr. Verver, when he really amused himself, let consistency go. The play of vision was at all events so rooted in him that he could receive impressions of sense even while positively thinking. He was positively thinking while Maggie stood there, and it led for him to yet another question—which in its turn led to others still. "Do you regard the condition of hers then that you spoke of a minute ago?"

"The condition—?"

"Why that of having loved so intensely that she's, as you say, 'beyond everything'?"

Maggie had scarcely to reflect—her answer was so prompt. "Oh, no. She's beyond nothing. For she has nothing."

"I see. You must have had things to be beyond them. It's a kind of law of perspective."

Maggie didn't know about the law, but she continued definite. "She's not, for example, beyond help."

"Oh, well then, she shall have all we can give her. I'll write to her," he said with pleasure.

"Angel!" she answered as she gaily and tenderly looked at him.

True as this might be, however, there was one thing more—he was an angel with a human curiosity. "Has she told you she likes me much?"

"Certainly she has told me—but I won't pamper you. Let it be enough for you it has always been one of my reasons for liking her."

"Then she's indeed not beyond everything," Mr. Verver more or less humorously observed.

—Henry James, The Golden Bowl (1904)

And perhaps the thing that makes us want to distinguish between realism and naturalism is that realism (James) will ask us to meditate on what it could mean to be "beyond everything," and naturalism (Norris) will ask us to imagine for a moment that we are "beyond everything." Naturalism doesn't have that recourse to urbanity and humor that pulls us back and makes us question the very idea of "beyond everything." Here's The Golden Bowl again:

"My idea is this, that when you only love a little you're naturally not jealous—or are only jealous also a little, so that it doesn't matter. But when you love in a deeper and intenser way, then you are, in the same proportion, jealous; your jealousy has intensity and, no doubt, ferocity. When, however, you love in the most abysmal and unutterable way of all—why then you're beyond everything, and nothing can pull you down."

Mr. Verver listened as if he had nothing, on these high lines, to oppose. "And that's the way you love?"

For a minute she failed to speak, but at last she answered: "It wasn't to talk about that. I do feel, however, beyond everything—and as a consequence of that, I daresay," she added with a turn to gaiety, "seem often not to know quite where I am."

"For a minute," Maggie is herself "beyond everything," or at least (as Norris might say) "beyond words"; "she failed to speak." But just as humor rescues Mr. Verver and relieves him of contemplating "beyond everything" in the previous selection, Maggie is able to pull back from the beyond "beyond everything," and can move back into speech "with a turn to gaiety." A certain segment of discourse, the one corresponding to the "beyond everything," is proscribed: "It wasn't to talk about that." But gaiety lets us go.

Nothing doing in naturalism, however: we will embrace oxymoron (words that are powerful beyond words) before we will back off of the "beyond." Naturalism needs something to be "beyond," and it is precisely the realist project of representation that it aims to take as the limit it wishes to overstep.

Even reading short fragments of James makes you start using the word "simply" with unusual frequency as well.

Wednesday, May 23, 2012

It should by now be clear...that naturalist discourse registers such a transformation in production in terms of what I have called the double logic of prosthesis: in terms, at once, of panic and of exhilaration. (160)

There was a moment at which this appeal to "panic" and "exhilaration" (whose?) was a common critical move. But in this, the age of affect theory, one wishes to know just what is meant by such words.

Friday, May 4, 2012

I don't really want to dwell further on the madness that is the Church of Higher Efficiency's* response to Naomi Schaefer Riley's anti-intellectual blog post dismissing all of Black Studies basically on the grounds that Schaefer Riley does not understand the titles of some dissertations. [background]

But I do want to note Amy Alexander's suggestion that cuts at UC and CSU should be the real target of outrage, as if there couldn't ever at any one time be more than one issue deserving of outrage.

As Gautam Premnath rightly pointed out, it's not as though the two issues are unrelated. "The crisis of public higher ed," Gautam observes, "has its roots in the contempt for scholarship you condone."

Schaefer Riley's MO—"check out these titles; aren't they obviously ridiculous? This entire discipline is clearly worthless!"—is a very familiar one. Various mainstream publications trot out the ritual mocking of the MLA program every winter, as if a journalist's inability to understand the titles of talks in a specialized field proved something. Eve Kosofsky Sedgwick describes how such tactics were taken up against (the title of a talk from) her work in her 1993 essay "Queer and Now." Anyone who knows scholarship knows that Sedgwick was a true thinker—careful, erudite, inventive, insightful. But you don't have to know anything to mock a title.

We're used to seeing such unrigorous hit jobs in the mainstream press, because the mainstream press is anti-intellectual. Amy Alexander has been defending the Church of Higher Efficiency's dubious decision to give NSR a blog (and, as Brian Leiter points out, this has been dubious for a very long time) on the basis that "CHE is a NewsOrg, not part of Academe." True enough. But if a paper purports to be the Chronicle of Higher Education, shouldn't it have a specialized knowledge of higher ed, or at least not be actively hostile to higher ed? Shouldn't ye olde MLA-season title-snarking be plain out of bounds for any higher-ed-related publication?

My sense is that a lot of academics feel ambivalent about the Church of Higher Efficiency—"it's a dreadful rag, but it's our dreadful rag." CHE is quite adamantly saying, "no, no, we're not your dreadful rag at all; we have no obligations to higher ed whatsoever." The Church of Higher Efficiency is thus taking the stance on scholarship that Amazon takes on books: you read it; it's a major part of your intellectual and personal life; it contains ideas? Great; whatever; to us it's a widget that we ship out of a warehouse in Tacoma. We are happy to ship you a coffeemaker as well; makes no difference to us. Pageviews, plz.

That's unfortunate, although I can't exactly weep over the Church of Higher Efficiency getting explicit about just how little it cares about higher ed per se. I mean, it's a dreadful rag (exception: the excellent Jen Howard). Rather, I want us all to make the connection that Gautam made, between these routine pot-shots at scholarship by journalists who proudly announce that they are not in a position to know what they are talking about and the kinds of sweeping policy changes that are currently leading to the effective dismantling of public higher ed in California and elsewhere. "Is college WORTH IT?" they ask. Not if you can make a career of announcing your lack of education and taking pot-shots at the educated, under the auspices of a periodical allegedly meant to serve the higher ed community, no less.

Thursday, May 3, 2012

The obligation of the child to be happy is a repaying of what the child owes, of what is due to the parents given what they have given up. The duty of the child is to make the parents happy and to perform this duty happily by being happy or by showing signs of being happy in the right way.

Tuesday, May 1, 2012

It never occurred to me to peruse the USPS's fine selection of stamps online until a friend alerted me (today) to the existence of poet stamps. I'm not much for the fetishization of specific poets, much less the category "Poets." Is there anything worse than "Poets"? But I do love spotting poetry in the wild, and this particular sheet of USPS 45c/Forever stamps offers us a glorious instance—a mini-anthology—a history (forever).

The poets pictured are Elizabeth Bishop, Joseph Brodsky, Gwendolyn Brooks, E. E. Cummings, Robert Hayden, Denise Levertov, Sylvia Plath, Theodore Roethke, Wallace Stevens, and William Carlos Williams—"ten great poets," as the website copy observes—a rather hodgepodge group, but not a bad one by any means. The USPS wishes you to know that this group "includ[es] several who served as United States Poet Laureate," and that, between them, these poets have been awarded "numerous Pulitzer Prizes, National Book Awards, and honorary degrees."

Most remarkably, "[t]he sheet's verso includes an excerpt from one poem by each of the poets featured on the sheet," making this sheet of ten stamps into a tiny anthology of twentieth-century U.S. poetry. What is the principle of selection at work here? It seems somewhat arbitrary, but not way out in left field, either. The apportioning of prizes to the represented poets is undoubtedly one factor, one indicator of notability. The poets are clustered at midcentury, with none particularly recent, and none preceding modernism. They are moderately, but not aggressively, "accessible" poets; another way of saying this is that they are moderately, but not aggressively, "difficult" poets. (Of course Williams was sometimes very aggressive with his difficulty, but that has all been redwheelbarrowed away.)

I wonder what the quoted selections of poetry are. Maybe I'll order the stamps and find out—maybe I'll order the anthology and read it. It costs about as much as a small press poetry book.

"The Twentieth-Century Poets stamps are being issued as Forever stamps in self-adhesive sheets of 20 (2 of each design). Forever stamps are always equal in value to the current First-Class Mail one-ounce rate."

Thursday, April 5, 2012

The inaugural issue of the Journal of Digital Humanities—a PressForward publication—is out today. Dan Cohen and Joan Fragaszy Troyano graciously invited me to guest edit a special section on digital humanities and theory. My introduction to the special section, which has not appeared elsewhere, is here.

I'm excited and intrigued by JDH, as well as cautious. What excites me most about JDH is the postpublication review model that it enacts, insisting on indexing work that has already seen a wide readership. In that sense, JDH is itself a digital humanities project, a fascinating trial version of a program for reshaping scholarly communication that has been much discussed but too rarely attempted.

Editing is always a somewhat heartburn-inducing affair, and I experienced some of that during the process, despite Dan and Joan's continual support, intervention, and hard work. The table of contents of the main body of the issue (the "Articles" section), over which I had no control, was and remains heavily white- and male-authored, a dismaying development that certainly contributed to recent discussions of gender in DH. At that time, through the vagaries of the editorial process, the preliminary table of contents for the special section on theory also came to be very heavily skewed, not only as to crude metrics of authorial identity but also in approaches to what constitutes DH or theory. Considering the terms of the discussion, and the very crucial role that #transformDH had played in it, this seemed maddeningly, frustratingly wrong. (Heartburn ensued.) Postpublication review, then, had its drawbacks, and they had to be painstakingly corrected for.

As it later turned out, DHNow had not been indexing most of the #transformDH group's work—an accident, but a bad one—and the theoretical shorthands for DH that were being rethought in this very special section also acted as filters for what came to "count" as DH work that might be indexed in DHNow. Is the Crunk Feminist Collective an instance of DH, for example? If not, why not? I was so, so pleased when crunk feminist and Emory DiSC fellow Moya Bailey agreed to contribute a piece that spoke to those very boundaries.

Postpublication review is also a hard thing to get used to simply as a humanities scholar. As an editor, I found it a little bit mindblowing to know that the authors' pieces were already accepted—postpublication, you know—and that they were under no obligation to revise according to my suggestions. Several authors did anyway—Alexis Lothian, Jean Bauer, and Patrick Murray-John were particularly thoughtful in this regard. It was wonderful to work with the authors, but, as an editor, slightly terrifying to have committed to this postpublication element.

JDH is a work in progress, and the JDH team are mensches all around. I'm honored to have been a part of the inaugural issue. But a (very recently) past involvement is less interesting to me than seeing where JDH will go from here. Will the problems with skewed demographics subside? (And how?) How will scholars respond to a publication comprising mainly pieces they have already read—comprising those pieces because they have already been widely read? What will the editing conventions ultimately turn out to be for authors contributing to JDH? What culture of open peer review—if any—will grow up around it?

The physical sciences and philosophy are famously hostile to women and people of color. Humanists resent their relatively lower pay and status, but breathe a sigh of relief that they don't have to depend on large corporate grants, and cling fiercely to the intellectual autonomy they enjoy. Humanists are also quietly glad that nothing they make turns a profit, because they're disgusted by the notion that the university would own the patent for it. Science disciplines think the humanists don't do real research. Humanities disciplines think the scientists can't teach for beans and don't care to. Graduate students in the science disciplines resist unionization because they know if they're paid the same as the humanities grad students, they'll be paid less.

These are just a few of the political dimensions of disciplinarity within the university that I can think of off the top of my head.

We tend to think of these as internal squabbles, by, for, and among academics. We tend not to see this disciplinarity as particularly relevant to our undergraduates, who switch majors seemingly at whim and in any case seem to graduate knowing so very little about what we, steeped in it as we are, think of as "the discipline" and "the profession."

But it's increasingly obvious that disciplinarity's politics are not contained among us academics. They are profoundly present in our undergraduates' lives, and guess what? Undergraduates graduate. And their disciplinarity goes with them.

This is not a polemic against disciplinarity, by the way. I believe in disciplinarity: the notion that certain objects of study demand the nuanced development of methods for studying those objects. I believe interdisciplinarity is only meaningful when it truly engages multiple disciplines.

I am only observing that the political dimensions of disciplinarity, which are of course inevitable, are not just an academia thing.

Recently in higher education, there have been a number of flirtations with the idea of charging undergraduates differential tuition based on the income a graduate holding a degree in a given major can expect to command (assuming, laughably, that there is a job available in the aforesaid graduate's field). The University of California, where I did my graduate degree, is one of the schools that has entertained this notion.

I am not aware of any schools that have yet adopted such a tuition plan (which, needless to say, I find revolting). But the idea is always lurking, because undergraduates are continually subjected to anxious/reproachful queries about what their degrees are "worth," or whether college is "worth it." Thus the logic of capital that underwrites the university is made plain: education is a commodity "worth" what it will gain back in economic returns, not the leading-out or drawing-forth named by the Latin e-ducere.

(Aaron Bady's excellent post on the subject is worth revisiting. I also highly recommend Historiann's pointed questioning of why it is that college is so often deemed "not worth it" specifically for poor people.)

This is just one example of the way that the politics of disciplinarity plays out at the undergraduate level.

The Daily Cal is the student newspaper at UC Berkeley, one of the major centers of student protest around the systematic dismantling of public education, and where some thirteen protestors have been criminally charged for infractions like blocking the sidewalk and—in the case of Professor Celeste Langan, who walked toward police with wrists held together and said, "arrest me," and who was subsequently dragged to the ground by her hair—"resisting arrest."

In my last few years at Cal, it was obvious that the protests were led and sustained by students in the humanities and social sciences, while students in engineering, preprofessional majors, and the sciences often disdainfully said they had "real work to do" and rejected the idea of protest. Political activism, even on behalf of the university itself, was understood as profoundly extra-academic and extra-disciplinary by those students, while it seemed (and seems) a core value for many humanities students—not just a core personal value but also a core academic value.

The Daily Cal article is about the ASUC (Association of Students of the University of California) senate election. There are two major parties at UC Berkeley, in addition to the usual spate of single-issue and joke parties: Student Action and CalSERVE. CalSERVE has historically been an activist, antiracist, relatively liberal and occasionally genuinely leftist party, while Student Action has been more conservative, devoting itself to "student life" issues like promoting athletics and getting film screenings on campus. The article is titled "Student Action senate candidate’s past Facebook posts elicit controversy."

The mapping of national political alignments, local campus political parties, and disciplinarity is made plain by the Facebook posts in question:

In Facebook posts made in November, UC Berkeley freshman Andrew Kooker said, among other things, that “taking the easy way out and doing an easy degree” allows students to “have time to protest something.”

“The American Dream is to be in the 1%; to be ultimately successful in society,” he said in a post. “Granted, you and your liberal arts degree surely won’t yield any results like that.”

I have to take a moment to say: I've taught dozens of students like Andrew Kooker. He's a college freshman, new to the idea of having a major, who probably doesn't even have one yet (most students declare in the spring of second year). If he's a Berkeley freshman, he's also not a transfer student, which means there's a whole world of reality—junior college, working your way through school, really figuring out what you need out of an education—that he doesn't know about. If there's one thing a young person should be allowed, it's leeway to make mistakes, to hold dreadful opinions and then change them, to speak out thoughtlessly and not especially pithily and then have those words vanish. This student is young, and I'm not condemning him; to do so would be unethical. For all we know, or he knows, in two more years he'll be a happy Art Practice major. Or a happy engineering major! There's no holding a freshman to his beliefs about college majors, and as the article notes, Kooker apologized for the statement, saying it reflected a belief about the engineering major that he held when he entered UC Berkeley, but no longer holds.

But this particular student is not my concern, so much as the way that this instance models the political ramifications of disciplinarity. I remember my own undergraduate years, the way a certain physics concentrator would loudly proclaim that all non-physics majors were "bullshit concentrations." And the economics majors—at Chicago, well before 2008? Oh, goodness.

I'm particularly interested in this moment in the article, in which the student election politicizes engineering:

Andrew Albright, the ASUC presidential candidate for CalSERVE – which has historically been Student Action’s primary rival student political party – said it was “shameful” for Student Action to include Kooker on their senate slate.

"Student Action's main base are engineers and Greeks." What is this tiny universe? It's an election in a place where everyone by definition goes to college, which serves as a strange political microcosm. Certainly no one in any national election is going for "the engineer vote." No one's even going for the "academic vote"—such a tiny, pointy-headed minority are we, whose political irrelevance must be proclaimed loudly over and over.

And yet, as we see, the political irrelevance of the academy—and of our arcane, oh-so-internal interdisciplinary squabbles—must constantly be re-announced only because the academic disciplines do ramify. They are matters of policy, of ideology, of politics.

And then, too, they are matters of knowledge—the reason we all got on board in the first place.

[UPDATE 3/26: This piece by Emily Jensen in the McGill Daily likewise reveals the politics of disciplinarity: the McGill Department of English Student Association (DESA) held a town hall that led Jensen to conclude that political activism is a form of "education, happening outside the lecture hall." She writes:

Previous to the town hall, I had a very selfish reason for not fully supporting the strike: I did not wish to forgo the opportunity I currently have to attend the incredible classes I am enrolled in. But I am one of the people for whom the opportunity to attend those classes has never been challenged. I have never been asked to give up that opportunity; to be asked to do so now is a reminder of just how valuable it is. It is perhaps a necessary step to take to attempt to make that opportunity equally accessible to everyone.

However, someone intimated that the strike would have more negative consequences for our GPAs than our government, and it is not ridiculous to suggest that missing class would impact our GPAs. But, quite frankly, my GPA can suck it. My parents are not paying out the nose for a piece of paper that says I have a 4.0. I wouldn’t have chosen McGill if that were the case. In losing the opportunity to inch closer to that 4.0, aren’t I gaining the opportunity to participate in another type of learning? Isn’t engaging in discussion and standing with my fellow students as valuable a learning experience as taking lecture notes? The ultimate goals of the experience outside the classroom may not be as easy to achieve. If they are, then you have still helped yourself take down one barrier to pursuing a degree. This includes the added bonus of having helped more than yourself.

***

UPDATE 4/01: It's come to my attention that this post was linked in the comment thread to the original Daily Cal article, where it was characterized inaccurately. The commenter construes this post as in some way supporting Andrew Kooker's comments or suggesting that they are not objectionable (or, say, wrong), which is of course quite backwards. My point is rather that anti-intellectualism and the politics of disciplinarity are quite a bit bigger than any given college freshman.

By the way, that comment thread is its own little cesspool of horror, where you can see the politics of disciplinarity, and the death of the art of rhetoric, in full swing.]