The Golden Compass is very self-conscious about its relationship to other literature. When you have an epigraph from Milton, you are not playing around. The book is situating itself in a specific, very eminent, masculine literary tradition, even as it tries to intervene in that tradition. Pullman’s hero is Milton by way of Blake; his enemy is C. S. Lewis.

The Golden Compass is obsessed with gender. It can’t help being obsessed with gender given the literary tradition it’s taking on. Pullman is explicit about this, and he specifically sees himself making a feminist intervention. I argue, however, that Pullman’s anti-Lewis intervention is subsumed by the logic of the very tradition he’s interrogating.

The tradition into which Pullman inserts himself is variously Christian; consequently The Golden Compass constructs a complex (and somewhat garbled) theology. Part of this theology involves imagining souls as separable from, and gendered complementarily to, the body.

These ideas are related to some of my recent musings on the relationship between children and animals (1, 2) and on the idea of a child whose soul is separable (e.g. in Pinocchio).

Monday, October 25, 2010

I was recently asked to serve on a panel for graduate students on applying for grants and fellowships. It was pretty fun, and I distributed useful arcana that no one else will tell you. (For instance: the left margin on the department letterhead is 0.65625 inches. The more you know!)

But here's the thing about useful arcana that no one will tell you: it freaks people out when somebody decides that, hey, maybe somebody should tell you--somebody like your peers. Not about the margins on the department letterhead; no one cares about that. I'm talking about two oddly powerful sources of fear and moral panic in our profession: Rate My Professors and the wiki.

I get the impression that in the minds of many academics, these--as much as, if not more than, the budget cuts and casualization of academic labor that are endemic everywhere--pose a threat to the integrity of the university. What we have to fear, many insist, are underinformed undergraduates and underinformed job-seeking junior scholars. In other words, people who are almost completely powerless. The puzzling thing is why anybody would give a crap about either site.

Now, it would be foolhardy to take the contents of either Rate My Professors or the Wiki purely at face value. (This is also true of anything on the internet, on the air, or in print, as trained researchers know perfectly well.) But let's look a little more closely at each of these web sites, both aimed at allowing a relatively underinformed, institutionally marginalized, almost universally young population to share impressions, opinions, and (yes, sometimes unverified) information. What's so threatening about them?

Let's look at Rate My Professors first.

The thing that strikes me about it at first glance is how boring it is. It's not salacious; it's not scandalous; it's not gossipy; it's not even informative. It's boring in the way that most Amazon reviews are boring. Some students post with more or less vitriol or fan-like devotion, but basically it's "yup, this class was pretty cool" or "dude, this prof is boring." Much as a grade on an academic essay tells you next to nothing about its contents or style, a rating tells you almost nothing about what a professor's actual teaching is like. (Also, yes, the chili peppers are sexist, as are many things on the internet.) It's not a useful site. So why does it inspire fear and loathing?

Well, there's its badness. From a social science perspective, it's bad sampling; the students who post on RMP aren't likely to accurately represent the total student population or the students of any given professor. So, okay, professors are being misrepresented. And I get the sense that a lot of people think that the students posting are acting thoughtlessly or maliciously, which may be disproportionately the case given the sample. But if most students were really posting specifically to spread vicious gossip, then you'd think there'd be some funnier stuff on there.

But here's what you really see on RMP: very, very dull, vague evaluations on the basis of criteria that seem to utterly miss the point of taking classes in the first place, given by students who believe that they are performing a public service by making their unincisive opinions known.

What's threatening about that? Well, in it we see the things that really do threaten higher ed: students' apathy; their misapprehension of the academic mission; the degree to which our pedagogical choices seem arbitrary and opaque to students; the degree to which college has been sold to students as either a vocational school or a luxury cruise, or sometimes, oddly, both; the fact that high school graduates whom we have accepted to our institution of "higher learning" almost universally cannot wield a comma worth a damn. That students feel and write this way--or share those writings--isn't the real threat. It's merely a symptom of the far more serious problem: that we as a profession have utterly failed to make the case for learning per se not only to the public but to the very people who are supposed to be engaged in it.

Now you might object that to read RMP as an accurate, and damning, symptom of the structural problems in higher ed is to assume that, if we were really such good professors, students would always be on board with the project of learning and say nice things about us. And that's clearly not true.

But that may be the most unnerving thing of all. For despite the sampling problem, and chili peppers aside, RMP very closely mirrors our own internal teaching assessment tools, the ones we use for hiring, promotions, and teaching awards. The piety that the graduate teaching center at Cal preaches is that student evaluations are a wonderful, accurate assessment tool that we should take very, very seriously. Evaluate early and often, and learn deeply from every one!

But a multitude of studies (just a few of which are cited below) suggests that a broad variety of confounding factors affects teaching evaluations, including gender, race, student expectations of gender norms, the "course effect" (professors teaching unpopular required courses are likely to receive lower evaluations), and grade inflation. And let's not forget the infamous chocolate study--the one that found that evaluations were higher among students who were offered chocolate before the evaluation. The evaluations on Rate My Professors are sadly familiar: the lack of specificity, the contradictions, the complaints that there was homework, the recommendations that the class not be scheduled so early in the morning, or that the prof bring in doughnuts. Let's face it: sometimes our course evaluations are quite useful, and sometimes they really aren't.

Does it mean that evaluations are worthless? Of course not, but they have to be considered with care. Our own internal teaching assessments are not much more methodologically sound than RMP. In other words, you never get an "out" for evaluating your sources, and you certainly can't just rely on a numerical score. It's no more reasonable to unthinkingly trust course evaluations than it is to unthinkingly trust RMP. The need for critical thinking is just a little more obvious with one of them, because it advertises its lack of methodological care with chili peppers and smiley faces.

So is RMP a threat? I think it's nothing compared to the threats that it reveals within the structure of the profession itself. It's depressing that such a significant percentage of students are in an academic institution for non-academic reasons, and it's chilling that one of the principal ways that we assess a central component of our jobs is a relatively inaccurate, little-understood tool.

* * *

In any case, the RMP panic is old news. You know what really sends (some) faculty into fits of terror? "The wiki," as it's known. Kind of like "the whale."

Here's the story. Some enterprising junior scholars, some years ago, made a free wiki for posting information on ongoing academic job searches. As lore has it, the idea was to share information among job-seekers, in solidarity in the face of a brutal market with an enormous power imbalance.

The way I often hear faculty talk about the wiki, it's a viper's nest of gossip, misinformation, and drama, something that will poison your mind and destroy your soul. And I guess it could get to be that way if one were to read or check it obsessively. But then, if you're checking the wiki obsessively, you've already got a problem, and it wasn't caused by the wiki. Since many faculty also freely admit that the academic job market itself has the capacity to poison your mind and destroy your soul--and that also goes for those who are on search committees--one suspects that we have another case of shooting the messenger on our hands.

I don't post to the wiki, and I don't read it with any frequency, but I have it bookmarked. Why? Because nobody is more likely to ferret out obscure fellowships than a large group of desperate, underemployed junior scholars. And seeking out fellowships for which to apply is a normal part of every research academic's life, so while doing one's usual fellowship trawl, why wouldn't one check a lengthy, highly inclusive list crowd-sourced by people in a position to care? It's far more comprehensive and up-to-date than the fellowship lists published by the MLA, for instance (sorry, MLA). No one list of fellowships is ever going to suit you perfectly or be highly accurate, so if you're going to have to double check all the information anyway (and believe me: YOU DO), you may as well avail yourself of others' obsessive information-gathering.

Is the wiki unreliable? Of course--like everything on the internet, on the air, and in print. But is it a viper's nest of gossip, misinformation, and drama?

Here's a screen shot of one of the "notes and queries" (heh) sections, for a position at Baker University (Baldwin City, KS):

SCANDAL! There's a ... clarification of the job posting, and a note about the department structure. Oh.

Unlike RMP, this does seem like actual information, which one could check up on if one wanted--probably pretty easily. What's so threatening about the wiki?

Perhaps what's threatening about it is that it isn't a viper's nest of misinformation and rumor--that it holds a mirror up to the irrationalities of the academic labor situation and gives the lie to the notion that there is anything that we could reasonably call a "market" with some kind of regulating invisible hand. Let me be clear: I'm not complaining about my personal situation (I currently have the best postdoc I could possibly want), nor am I claiming that the wiki is a thoroughly accurate source of information. (I mean: it's a wiki.)

But I do believe that, in the aggregate, the wiki is a testament to the lack of transparency and unreasonable burdens in the process, and the large number of junior scholars who suffer for it. I happen to know it's not a walk in the park for the hiring committees, either; it's neither fun nor actually useful to have to evaluate five hundred applications, and the most freakish part of the tight job market is that so many searches actually fail. No suitable candidate, after reading all those files and interviewing all those people, is found. The very existence of the wiki is a symptom of a much bigger problem in academic labor that affects tenured professors as well as ABDs, postdocs, and other junior scholars (and of course the perpetually lamented, rarely actually helped adjuncts). But it's much easier to condemn the wiki and run yet another search, having the already overworked faculty stay up nights exhaustedly skimming five hundred abjectly well crafted cover letters, than it is to try to effect a systematic change in the academic labor situation--a task so monumental as to be nearly unthinkable.

I'm not saying anything new or even, I think, controversial about the structural problems in higher ed. The job "market" barely functions and exacts a lot of collateral damage, and undergraduate teaching faces a significant crisis of legitimacy because the public is neither clear on what learning is nor convinced that undergraduates actually ought to do it. What the moral panic around the Websites of the Frustrated (that's my new umbrella term for RMP and the wiki) tells us is that we'll do anything to avoid addressing either problem at its root.

Friday, October 22, 2010

A smattering of books on girls, presented in order by date of publication:

The Mental Flower Garden, or, An Instructive and Entertaining Companion for the Fair Sex In Two Parts: To Which Are Added, Interesting Sketches of Female Biography. New-York: Printed by Southwick & Hardcast, 1807. Print.

Newcomb, Harvey. How to Be a Lady: A Book for Girls, Containing Useful Hints on the Formation of Character. 8th ed. Boston: Gould, Kendall, and Lincoln, 1850. Print.

Benjamin, R. C. O. Don't: A Book for Girls. San Francisco: Valleau & Peterson, Book and Job Printers, 1891. Print.

1. If you find yourself declaring, "mine is the only subfield in which creative, interesting work is taking place, and anyone who is not interested in my subfield is stupid and bound for obsolescence," you might want to stop and reflect.

2. If it is a frequent refrain in your subfield that the aforementioned subfield is the only one in which creative, interesting work is taking place, and that anyone who is not interested in that subfield is stupid and bound for obsolescence, you might want to stop and reflect.

Wednesday, October 20, 2010

Next semester I'm teaching a course on early twentieth-century American poetry:

Didactic Modernism: American Poetry, 1915-1945

This course will provide an overview of American modernist poetry, addressing key concepts in modernism including impersonality, the crisis of representation, and abstraction. Among these, however, the course will take as its primary area of investigation modernist American poetry’s manifold attempts to refashion the way people read, casting readers as pupils requiring instruction. This course explores the ways in which modernist poets construed literary change as demanding a return to “the basics”: a revision of the literary canon, new demands on the reader’s education and attention, and a reconsideration of what it means to read--or to learn to read. We will approach this topic from two critical angles: first, by way of theories of language advanced by Saussure, Austin, Derrida, and Wittgenstein in a philosophical return to the basics; second, through a consideration of the history of pedagogy and childhood in America. This course will prefer lingering over longer bodies of work to reading single poems by many authors; as always, coverage cannot be comprehensive. Students will write two short papers and take a final exam.

I've already had a few people email me about the wait-list. In case anyone reading this is thinking of emailing me with the same: it doesn't matter what your registration status or class standing is; unless you're actually at this moment registered, I have no idea whether you'll get in. English majors and students with higher class standing will be given priority, but I won't be the one making those calls; it will be the arcane Tele-Bears system that does it. I also have no idea how many registered students will drop; that's ESP territory, and I'm not psychic.

To clarify: AT THIS POINT, I HAVE NO CONTROL OVER OR SPECIAL INSIGHT INTO THE REGISTRATION PROCESS.

I sympathize with students' frustration with the uncertainties of the registration process. This is why it matters so much that students make themselves aware of the administrative policies at their own universities, and, at public institutions like Cal, aware of state politics. This is your university, if you care to help shape it.

Tuesday, October 19, 2010

In Nest, Mei-mei Berssenbrugge invokes the horrors of cuteness in a poem titled "Dressing Up Our Pets," a poem that, as I posted in August, reminded me of the inappropriate weirdness of a Regretsy post.

Now my sister Maria has alerted me to what I can only describe as a similarly horrifying crime against taste, and probably against animals in some states, in the name of cuteness: a photo gallery of pets dressed up in Halloween costumes, courtesy of the Boston Globe.

The screen shot below of "Madison Yee in a bee Costume" [sic] would seem to show an animal costumed as a different animal. But since it's a "Halloween costume," and dressing up for Halloween is specifically a children's tradition, it's actually a dog costumed as a child costumed as a bee.

I can only feel sorry for this animal. It had no choice in the matter.

What motivates the decision to dress a pet up like a child, or to dress a child up like a pet?

In A Christmas Story Ralphie is forced to wear an atrocious bunny suit sent by his aunt; his mother forces him to put it on and then, when he does, can't suppress a laugh as she exclaims, "Isn't that cute!" (His brother just laughs openly.)*

Sianne Ngai argues that the cute creature is defined partly by its unthreatening aggression. Cuteness has a "capacity to convert a subject's veiled or latent aggression toward a vulnerable object"--like a child or a pet--"into an explicit aggression that seems to be directed toward the subject" (828). Like Ralphie miserable in his animal suit, like the animals miserable in their child suits, the cute creature is on the receiving end of aggression and is visibly dissatisfied, but can't do anything about it.

In the abstract that makes plenty of sense. But it seems like there's more going on here, because of the special symbolic connection between children and animals. Haraway points out that we are particularly prone to speaking of pets as if they were "furry children." To dress a child as an animal is an act of aggression that renders the child particularly cute.

Why is it that dressing a child up as an animal, and apparently vice-versa, according to the Boston Globe, constitutes a privileged special case of cuteness?

Perhaps it is precisely the symbolic closeness between animals and children that makes it so very aggressive and uncomfortable (therefore cute) to persuade (or force) one to masquerade as the other. This is the sort of thing that makes me think we need to think much more carefully about the relationship between animals and children.

Image by Yoshitomo Nara

*The humiliation of being dressed up in something awful is, oddly, not associated with being identified with an animal in the film, but rather with the ultimate humiliation: being "perpetually four years old [and] also a girl." It's not that the costume is appropriate for a girl or that a girl could look dignified in it (who could?) but that the indignity would somehow make sense for a girl. It is the father who protects Ralphie's masculinity by urging him to take the costume off. (Uh, and gives him a gun that can really shoot. But I digress.)

It's been observed before (I made the link myself recently) that in dictating the Autobiography, Mark Twain was essentially blogging. A recent CBS story on the release of the Autobiography quotes Bob Hirst putting it like this:

"Mark Twain wants this autobiography to be random," Hirst said. "You know, he's going to talk about what he wants to talk about on this day, change his mind and move onto the next thing."

You heard that right . . . talk. One of the greatest writers in American history decided the best way to tell his own story was NOT to write it, but SPEAK it.

Daily dictations over four years, about whatever he found interesting that day.

So was Mark Twain the first BLOGGER?

"I would say that is exactly right," Hirst said. "Partly a journal, partly a diary, and partly recollection. So yeah, I think of it as a kind of blog, a blog without a web!"

The thing about blogs, though, is that whether or not they are particularly plugged into the Zeitgeist, they're timely by virtue of the way they're parceled out in time. A post today, a post tomorrow. But the Autobiography's dailiness actually isn't so clear, for a number of reasons.

This was a guy, let me just say, whose sense of timeliness was very different from the rationalized, homogeneous empty time of the RSS feed. As you'll find out if you read the Autobiography (and I'm pretty sure I've blogged about this before, it's so incredible), Twain thought it would be brilliant to have a periodical devoted entirely to old newspaper articles. It would be called The Back Number, and it would publish an assortment of news articles of yore without comment or context. In a way that's just what the Autobiography is like as well. Newspapers to Twain aren't "one-day best-sellers," as Benedict Anderson cleverly put it; they're more like flies in amber--interesting, enduring, a little gross. How do we square that kind of mentality with the logic of blogs?

Let's put aside the fact that the Autobiography is over a hundred years old (the dictations in Volume I, the only volume that's out, are mostly from 1906, I believe). Quite often one day's dictation will leave off and pick up immediately the next day. Sometimes Twain spends five days' dictation telling one story, and the dates are no more than interruptions. This obscures the sense of parceling out that we get from blogs.

Moreover, these dictations are mostly reminiscences, progressing day by day but alluding to different points in time. Like a blogger, Twain is talking about whatever he feels like talking about on that day, but because he's also recounting his life (in a very haphazard way), the day-by-day progression of his dictations butts up against the scrambled chronology of their contents.

And finally, Twain's dictations aren't actually always produced day by day. For one thing, he does edit, introducing the recursivity always implied by editing. His stenographer, Josephine Hobby, would make a typescript, which Twain would then edit and Miss Hobby would re-type. Often this process happened twice. And the dictation itself? Well, it wasn't always dictation. Sometimes he would instruct Miss Hobby to insert an old newspaper article, or a letter. And not infrequently, he would instruct her to insert an old piece of writing. For instance, most of "In Memory of Olivia Susan Clemens. 1872-1896," a piece written in memory of his daughter Susy not long after her death, was inserted into the February 2, 1906 dictation. Which is to say that he did not compose the 2/2/1906 dictation on 2/2/1906 at all, but rather decided it was the right point in his writerly timeline to introduce an older piece.

I'm sure this is noted in the explanatory notes (I don't have the volume on me at the moment; it's in my office, being too heavy to schlep around casually), but you wouldn't necessarily know it from reading. Sometimes Twain points out when he is quoting himself, usually when presenting and commenting on funny set-pieces. Sometimes he doesn't. In other words, I think the blog comparison makes sense for the book as published, but it breaks down in the archive.

The temporality of blogs is complicated, but the temporality of Twain's Autobiography is more complicated still.

Friday, October 15, 2010

The tag-line for the Gregory Brothers' Autotune the News series is "everything sounds better."

I'm fascinated by the internet principle that there is almost nothing that cannot be improved by a dance beat. The Gregory Brothers (a misnomer--they are actually the Gregory brothers and sister-in-law) capture the highly mediated, corporatized medium of television news and mediate the hell out of it, editing, remixing, inserting themselves into the videos, and of course turning media and political celebrities into American Idol-like singing contestants. (Each Autotune the News video ends by awarding the unwitting participants "best accidental singer" rankings.)

In reappropriating the news, the Gregory brothers trivialize it, but perhaps not more than it has already been trivialized by the news media itself. There's a way in which Autotune the News performs the same work as the classic spoof trailer of The Shining, which pitches the horror film as a goofy family comedy:

It reveals the extent to which music and framing tell us what to think about the film we're watching. But Autotune the News doesn't just reframe; it also insists on a particular kind of reframing through music--patently derivative but joyous nonetheless. The "serious" news--where seriousness has been produced by grave tones and facial expressions, music, and infographics--becomes an occasion for catchy pop play.

This ethos seems to find the ultimate expression in the "Bed Intruder Song." The song is based on real news footage--a moment of the same kind of outbreak of human passion that Autotune the News celebrates. Regular guy Antoine Dodson delivered a brilliantly enraged rant on the news in response to a terrifying home invasion and the attempted rape of his sister Kelly. Breaking out of the dampening mediation of the interview format, Dodson addressed the public and the fugitive attempted rapist directly, expressing his righteous rage and heaping indignant insults on the criminal. The Gregory Brothers autotuned and edited the video to create the Bed Intruder Song:

Tellingly, Dodson was surfing Facebook at the time of the invasion, and later took to the Bed Intruder Song with glee ("it's my ringtone!" he acknowledges). Dodson himself is a part of the internet culture that believes that "everything sounds better" autotuned and set to a dance beat. And why not?

In the only logical turn of events, Dodson later performed the Bed Intruder Song live (but still autotuned!) at the 2010 BET Hip Hop Awards, with a Gregory backing him up on keyboard:

Even Carl Sagan sounds better with a beat, as one obsessive musician has demonstrated:

Perhaps naïvely, I see the "everything sounds better" ethos as a resistance to the corporatization of art and media and the utilitarianism of the information age. Students are increasingly trained vocationally; they're made to feel guilty for taking courses that don't feed into their pre-med requirements, or for (heaven forbid) majoring in music or theater or German (or French, Italian, Russian, Classics, or theater, the programs being cut at SUNY Albany, or philosophy, which is on the chopping block at, of all places, Howard University). Making art and engaging in critique, they're told, must be left to those who can make a living at it, and the people who can make a living at it are those who work for the major media corporations.

But people can't live without making art and engaging in critique; more to the point, they don't want to. At a time when everybody knows that breaking into sanctioned corporate art depends on luck and popularity with focus groups, ordinary citizens who aren't going to win American Idol (one of the Gregory Brothers, I believe, was briefly a contestant on one of these shows--sadly I haven't had time to research this in any detail) rebel by reappropriating corporate media, often flirting with copyright violation in the process. Corporate media becomes everybody's media; CNN cannot do better than I can do, because "everything sounds better"--and a little funny--with a dance beat. Supposedly utilitarian information, routed through the infamous 24-hour news cycle, is revealed as entertainment--entertainment that now "sounds better." Sometimes it's creative and original (I think the Carl Sagan song definitely qualifies as original) and sometimes it isn't. But it's pop culture, and it's ours.

Wednesday, October 13, 2010

A little linkspam for today, courtesy of Twitter.

1. Here's a hilarious video by Ron Charles on "Booklette," a new online tool that takes "the flesh-crawling weirdness of Chat Roulette and combine[s] it with the total uselessness of crowdsourced reviewing."

Links don't last. The "Disappearing Act" authors found that "49.3 percent of the original 2,011 cited resources could not be located at the cited URL," according to the paper's abstract. "The older the article, the more likely that URL's in the reference list of that article were inactive."

As part of her study, Ms. Wagner did a literature review of about 95 other link-rot studies across all disciplines, including a few in the humanities. (If you know of humanities-related work on link rot, please let me hear about it.) The universal conclusion: Too often "the stuff was just not there anymore," she told me. "It is a problem that affects all fields."

College professors take a lot of heat from the general public, and we deserve much of what we get; and humanities professors get the worst of it. And arguably, English professors the worst of that: we represent, apparently, the absolute nadir of contemporary culture.

I said that to some degree we deserve it; what I did not say, you’ll notice, is that it’s true. Untrue, but we deserve it? Well, yes: I think that college professors as a group, and English professors as a high-visibility (and high-risibility) subset, have done a terrible job of explaining just what it is that we do, and actively countering the most pernicious caricatures of our work that circulate in the larger culture.

One of the received ideas about profs, of course, is that we’re incurably self-absorbed. It’s hardly convincing for me to argue that I’m not self-absorbed, of course; here I am, sitting at my laptop, writing on my blog, and if I were a world-class narcissist, presumably I’d be the last to know. But I can tell you about my colleagues (by which I mean not just those with whom I work at Pomona, but my professional colleagues nationally and internationally); they sometimes disappoint, but far more often, I’m stunned by their generosity.

I've been meaning to put it out there that I've been doing some reading around in animal studies recently. I feel very ambivalent about this, in part because animal studies always seems to me to have the potential to reveal itself as sentimental "I love my dog!" BS. I still find the cyborg wave of posthumanist studies more compelling.

I think one day I'd like to undertake a serious study of the symbolic-discursive relationship between animals and children. We sometimes speak of children as if they were little adults, or as if they were the colonized (Nodelman). The latter is particularly troubling to me when we consider that there are people who are both children and colonized, a fact that the analogy between children "in general" and colonized peoples tends to obscure.

The real analogy that pervades our literature is between children and animals. Think of Curious George and Stuart Little, the child-animals--even the boy in The Witches who is quite content to have turned into a mouse. Animals and children are the two paradigmatic cases for studying cuteness. The question of language acquisition (and whether it is possible in the case of animals, and whether children who never have it are therefore animals) is likewise a central connection. I think we should take the comparison between children and animals seriously. We need a better philosophy of childhood, and the philosophy of animals could be illuminating in that regard.

Monday, October 11, 2010

I went to THATCamp Bay Area this past weekend. Roy Tennant from OCLC blogged about it briefly. I learned some new things, met some cool people. I'm exhausted, though, I have to admit!

I'm still puzzling over why digital humanities folks are so obsessed with space. I'd say at least 40% of the sessions were about space and place. But surely the temporalities of digital media are at least as interesting--more, in my opinion. I suspect much of the interest has to do with the availability of existing digital tools (Google Maps, etc.) for making stuff, often pretty pictures. Time is almost necessarily distorted as soon as it's visualized, and in any case, I'm not sure that there's anything we really want to make with time data. This blog post is going to have a time stamp -- a marker, to the minute, of when this post appeared on the public web. It won't tell you when I wrote it (Sunday afternoon, right after the last session? this morning, after mulling it over?), or how long it took me, or whether or how much I revised. Nearly everything on the web is time-stamped, usually to the minute, quite often in GMT. The web's apparent homogenization--and punctualization--of time belies the multiple synchronous, asynchronous, proleptic (how far in advance to queue a post?), stream-of-not-exactly-consciousness modes that operate in and around it. We've never had many good ways of talking about time, but somehow this philosophical difficulty is exaggerated when it comes to digital humanities.

I often find digital humanities as frustrating as it is productive; "narrative" means different things to a programmer and to me. I keep having those "you keep using that word" moments. It's not a bad thing. It's a good reminder that we need to return to basics sometimes and let people outside our fields know about our basic concepts and vocabulary. In one conversation that turned rather freewheeling, the issue of attention on the internet came up, and no one else at the table was aware of the discourses of attention that pervaded my own period of specialty (according to Jonathan Crary, the obsession with attention starts around 1870, and you can see it running through Benjamin, among others). It's absolutely relevant to DH discussions, but it hadn't occurred to me before to talk about it at THATCamp.

It strikes me that all of the "bootcamp" sessions (where someone instructs the group in "real skills") were tech-oriented. If I go to another THATCamp, I will definitely propose a humanistic bootcamp involving some philosophy of history and an intro to some relevant literary and historical concepts, or maybe a Foucault or a Bakhtin bootcamp. Programmers can be taught these things, after all!

Thursday, October 7, 2010

I wasn't able to blog about today's protests at UC Berkeley, in part because the fire alarm in Wheeler Hall went off twice, resulting in my temporary decampment to area cafés. The Daily Cal blogged and tweeted the day's events:

Students, faculty and staff across the nation are gathering on Oct. 7 in defense of public education. After California faced a $24 billion budget gap last year and cut $637 million in funding to the UC system, the university took steps to fill its own funding hole, implementing a furlough program and a 32 percent fee increase for students.

Campuses across the system made their own cuts, the impacts of which are still being felt this year. And with the announcement that UC Berkeley will cut some 200 positions in January, a freeze in faculty hiring, a proposal to develop online courses at the UC, a rise in out-of-state student enrollment and the elimination of four intercollegiate athletic teams this year, protesters say they still have much to rally about in regards to the changing nature of the university and public education as a whole.

Speaking of public education as a whole, let's not forget about SUNY Albany's decision to phase out Italian, French, Russian, Classics, and Theater, which Roland recently blogged about.

On Tuesday there will be a public forum on law school dean Chris Edley's proposal for a UC cybercampus, a proposal that, when I first heard it last year, I fully thought was a parody or a joke. Evidently, not only was Edley serious, but UC admin is taking his suggestion seriously.

That's not to say that there's no such thing as a good online course, or a good course with online components. But developing and maintaining such a thing is difficult and time-consuming, and depends on serious pedagogical and intellectual decisions. There are people out there thinking online education through in very interesting ways (Cathy Davidson, most prominently, but also Kiri Miller, among others)--it's called research. If only Dean Edley had heard of it. There's so much wrong with this whole online campus idea that I'm going to stop writing now to prevent my blood pressure from spiking.

It's about 6:30 and the sun is setting. Now that there are no fire alarms going off in Wheeler Hall, I can hear the sound of either students or hippies, or possibly both, out drumming in Sproul Plaza. Doe Library is still occupied.

Wednesday, October 6, 2010

Here's an update on the fortune cookie question, since apparently in addition to devoting myself to the three pillars of research, teaching, and service, I'm also to be the Dear Abby of MLA citation practices. Or maybe the April Winchell of MLA citation practices.

Arrick Underhill writes in to ask:

so, if I want to quote a fortune cookie and put it in my Works Cited, only later discovering that according to Google the quote originated with George Bernard Shaw rather than Ancient Chinese Wisdom, is it acceptable for me to continue with my plan to cite the fortune cookie? Or am I duty-bound by the standards of academic conduct to remedy the plagiarism of others, which seems to be the result of a time warp in which George Bernard Shaw actually made contact with Ancient Chinese Civilization and passed down his quote, in English, which reached me here in the 21st century. Or it might have been the 20th. I can't remember.

Thoughts?

To be honest, the only reason I can imagine for citing a fortune cookie in the first place is to perform some kind of pomo hipster DFW-wannabe crap. In that case, the point is not actually to cite anything, but to parody the practice of citation by way of a crispy take-out treat. If so, one should take to heart the MLA Handbook's directive to use your wits and adapt the style as necessary to the situation, e.g.:

"Yes, you squashed cabbage leaf, you disgrace to the noble architecture of these columns, you incarnate insult to the English Language: I could pass you off as the Queen of Sheba." Fortune cookie. Berkeley, CA: Shen Hua, n.d. Eaten 6 October 2010.

Indent appropriately and alphabetize under Y, secure in the knowledge that Susanna Clarke entirely pwned you as early as 2004.

Okay, now that that's out of the way, let's get serious. It's scholarship time, friends. Citation is about directing readers to your sources, and the truth is that readers are unlikely to reproduce your fortune cookie experience. Forget the cookie and cite Shaw, with a standard parenthetical citation in the main text.

Does it sound like I'm squishing fun? Far from it. Scholarship affords something far more fun than the pale pleasures of parody: endnotes. (MLA style calls for endnotes, not footnotes!) This situation is ripe for a lengthy digression on how you arrived at the quotation, the appropriateness of a fake Chinese saying appearing in a fake Chinese dessert, and the Western desire to produce identity through a projection onto a mythic Orient. Ideally the endnote will cite Said and Auerbach, and finish with a lengthy discussion of monstrosity, and The Wonders of the East, and the checkered history of Cotton Vitellius A.xv.

Technically speaking, MLA style frowns on lengthy notes. But scholars love them for the freedom and joy in research that they express. This is what you get when you try to cite a fortune cookie.

In this lecture, Professor Ian Hacking will explore how our innate sense of symmetry has enabled us to probe the most hidden secrets of nature and also to get along with each other.
About Ian Hacking

A distinguished philosopher, Ian Hacking combines attention to anecdotal details about our experiences with very general conceptions of the place of human beings in the world. He likes to think of himself as a philosophical anthropologist. In this lecture he will present a new development in his philosophy, one which remains in the spirit of what has established his reputation as a "Philosopher of the Particular Case."

His early work, represented by The Emergence of Probability (1975) and The Taming of Chance (1990), brought a new understanding of how statistics changed the world and how we think about it, from sociology to physics, not omitting sports and our sex lives. His Representing and Intervening (1983) returned philosophers of science to their roots, namely experimental science. It began what he calls a "back to Francis Bacon movement," which has changed the history, philosophy, and sociology of the sciences.

Monday, October 4, 2010

Lefebvre gets a little cranky about people flinging around the word "production":

There is a point beyond which reliance on such formulas as 'the production of knowledge' leads onto very treacherous ground: knowledge may be conceived of on the model of industrial production, with the result that the existing division of labor and use of machines, especially cybernetic machines, is uncritically accepted; alternatively, the concept of production as well as the concept of knowledge may be deprived of all specific content, and this from the point of view of the 'object' as well as from that of the 'subject' -- which is to give carte blanche to wild speculation and pure irrationalism. (72-3)

Sunday, October 3, 2010

I was pretty upset to hear that SUNY Albany is cutting its programs in French, Italian, Russian, Classics, and Theater. Is this the future of public universities?

According to Brett Bowles, graduate chair of French Studies, the reason these programs were singled out was that there was a low student-to-faculty ratio in them (something usually considered a good thing, but apparently too great a luxury now!). And you have to figure that, sure, there can't be that many Russian majors at any given school. But even leaving aside the bleak vision of a world without Russian majors, what happens when you phase out the department altogether, as the school seems to be trying to do? Is SUNY Albany really comfortable more or less ensuring that their engineering majors, their English majors (!!), and their biology majors haven't taken French, Italian, Russian, Latin, ancient Greek, or theater? Like, ever? What kinds of graduates are they planning on turning out, exactly?

Saturday, October 2, 2010

It looks like the Holloway web site isn't being updated, but the series is definitely still going, so for all you googlers out there, here's the rest of the fall lineup:

Tuesday, October 19 at 6:30: Rachel Zolf

RACHEL ZOLF’S poetic practice explores interrelated materialist questions concerning memory, history, knowledge, subjectivity and the conceptual limits of language and meaning. She is particularly interested in how ethics flounders on the shoals of the political. Her fourth full-length book, Neighbour Procedure, was released by Coach House Books in 2010. Previous collections include Human Resources (Coach House), which won the 2008 Trillium Book Award for Poetry, Masque (The Mercury Press), Shoot & Weep (Nomados), from Human Resources (Belladonna books) and Her absence, this wanderer (BuschekBooks). Born in Canada, she lives in Brooklyn.

Thursday, November 4 at 6:30: Ken Irby

Kenneth Irby was born in Bowie, Texas, and grew up in Fort Scott, Kansas. He is a graduate of the University of Kansas, Harvard University, and the University of California, Berkeley. He has variously lived, worked, served in the Army, and taught in New Mexico, Nevada, the North Pacific, California, Massachusetts, New York, Colorado, and Denmark, and currently lives in Lawrence, Kansas, teaching in the Department of English at the University of Kansas. His recent book is The Intent On: Collected Poems 1962-2006 (North Atlantic Books, 2009). In 2010 he received the Shelley Memorial Award from the Poetry Society of America, sharing that with Eileen Myles.

Friday, October 1, 2010

I don't have time for a proper write-up of Harryette Mullen's Mixed Blood talk and Holloway reading last night, but here are some brief impressions of the reading. She's been at work on a family history project that has led her away from poetry (the subject of her Mixed Blood talk earlier in the day), but in the meantime she's been practicing a kind of poetic discipline by writing a tanka a day. The reading was of these new daily tankas. Some were better than others, but they seemed to me to really get at a Japanese aesthetic in their brevity, their observant quality, and their ability to register differences in similarity -- moments that acknowledge differences between animals and humans, between humans and their environment, between people, and yet at the same time make of that difference a commonality or even identification. The poems' dailiness shows--sometimes they comment on recent events in the news--and they are very much Los Angeles poems, for the most part. They were different from the writing by Harryette Mullen that I'm used to, but there was still that element of wit and humor that characterizes her other writing. I also thought Seulghee Lee's introduction was smart and useful.