Thursday, December 31, 2009

Beginnings and ends of years are only a convention, of course. The Gregorian New Year we're about to observe is particularly weird, it seems to me, though I suppose you could make a case that the roaming Chinese lunar new year (January 26 in 2009, February 14 -- hey, that'll be Valentine's Day! -- in 2010) is more so.

In any case, it's true that you might as well take stock of your life, or your year, at any point in the year, but January 1 is good enough for me, not least because it's also my birthday, another one of those stock-taking days.

Jon Swift is no longer blogging, and one of the lesser losses as a result of his retirement is that he's no longer inviting other bloggers to nominate their own best posts of the year. Like Avedon at The Sideshow, then, I'll blow my own horn here. My favorite post of 2009, the one I most wish everyone would read, is "Dude, I'm a Fag," my discussion of the cost of gay respectability and assimilation. ("Assimilation" is a dodgy word when applied to gay people, I know, and if I made New Year's resolutions, one would be to follow up on that statement soon.) It got more attention in terms of links and readers than any other single piece I've written for this blog, and I'm proud of it. (Along with its followup.)

I sometimes feel -- not guilty, exactly, but I'm not sure what the right word would be -- I feel strange that I live in a city where unemployment is much lower than the national rate, and that I have a stable job with benefits, when so many people don't. During the break, when I've been out and about a lot, I've become aware of how many of the people who hang out around the public library and the bus station are homeless. Yesterday afternoon, for example, I saw a man with the telltale overstuffed backpack and heavy clothing asking a couple, "Do you have a place to stay tonight?" The temperature dropped from a not-too-bad 37 to this morning's 10 degrees F. Another resolution-if-I-made-one is going to be to start making some regular donations to the local food bank and free kitchens.

Anyone who's read this blog will know that I haven't been disappointed by President Obama's performance: my predictions from before he took office have been borne out abundantly. The only thing that disappoints me, just a little, is my own inability to do the "I Told You So" dance at his supporters. But on the whole, I suppose that shows me to be a better person than the Obama fans, who've behaved just as badly since he took office as they did before. It's significant, I think, that this blogger at The Nation, listing "some under-appreciated progressive victories that should inspire hope for 2010", doesn't mention Obama once. Which is the best attitude to take, I think -- as I've pointed out before, the real victories aren't the occasional crumbs the rulers throw us; they're the ones we take for ourselves. Time to abandon Big Man worship in favor of mass action. If I can just find the right mass to join...

And that reminds me, I finally found one of the articles I've been looking for about the effectiveness of Democratic abusiveness to voters on their left. This is an interesting interview from Counterpunch in 2003. Obviously, Sam Smith's question was never answered, and unfortunately, given Obama's victory in 2008, it's unlikely that the lesson was ever learned:

What I tell my Democratic friends is that if they want my vote they have to treat me at least as nice as a soccer mom or one of their corporate campaign contributors. How come, I ask, Greens are the only constituency in history that you think you can convince by hectoring them? What are you going to do for my vote? I ask. And they look at me perplexed.

Another good line from the same interview: "Polls are the standardized test used by the media to determine how well we have learned what it has taught us."

Wednesday, December 30, 2009

It's now a year since the beginning of Operation Cast Lead, the Israeli blitzkrieg of Gaza. Uri Avnery has a good retrospective on it at Counterpunch, showing that it was at best an ambiguous victory for Israel. The siege of Gaza goes on, however, with the US Army Corps of Engineers working with Israel to maintain it. A large protest, the Gaza Freedom March, is simmering in Cairo, to the great discomfort of the Egyptian government; the participation of Hedy Epstein, an 85-year-old Holocaust survivor from St. Louis, adds to the bad PR for both Israel and Egypt.

But I want to write about a startling post on the Israel-Palestine conflict by Stephen Vizinczey on his blog. It's hard to believe this was actually written by Vizinczey, who wrote some insightful analysis of Soviet and American imperialism in the past. But, it seems, like many people his insight breaks down where Israel is concerned. He wrote:

The recent condemnation of Israel is a fine example of the selective outrage that ensures that there will never be peace on Earth. “We rain rockets on the Israelis, on their farms, on their nurseries, their schools. We explode bombs in their restaurants, in their supermarkets – we are going to wipe Israel off the face of the Earth! But they are so evil, they want to live, they get angry, they hit back, they try to defend themselves! Condemn them!”

This makes no sense on any level. To begin at the surface, who is supposed to be speaking in his imagined quotation? Since it purports to be a "we" who "rain rockets on the Israelis," it would have to be Hamas or, since Hamas stopped firing rockets into Israel, various small independent Palestinian factions. (The bit about wiping Israel off the face of the Earth is a giveaway too.) According to Avnery, though, "The Qassams have stopped almost completely. Hamas has even imposed its authority on the small, extreme factions, which wanted to continue." Of course such groups would condemn Israel, and there's nothing "selective" about it. (Does Vizinczey expect the Palestinians to "rain rockets" on North Korea? The US?)

But Vizinczey isn't just angry at the Gazans; it seems that he's blaming everyone in the world who condemned Operation Cast Lead, none of whom rained rockets on the Israelis. Perhaps he means that Archbishop Tutu, Israeli combat veterans, Jimmy Carter (though he has recently recanted, or at least backtracked), and Roger Waters of Pink Floyd all figuratively rained rockets on Israel, or figuratively cheered on the rockets. It would have to be figuratively, or literally in the sense of figuratively, because virtually all the critics of the attack on Gaza were scrupulous in condemning rocket attacks on Israel.

As Vizinczey must know, the reason for the widespread condemnation of Operation Cast Lead, as of the 2006 Israeli invasion of Lebanon, was not that the Israelis were defending themselves, but that they were going so far beyond defense. Israeli casualties during Operation Cast Lead were one percent of Palestinian casualties; a similar disproportion has characterized most Israeli violence against the Palestinians. Indeed, many of the charges made against Hizbollah in Lebanon and against Hamas in Gaza turn out to be true of Israel instead: breaking ceasefires, for example, or using civilians as human shields.

The Stephen Vizinczey who dissected American justifications of its massive violence in Vietnam would have recognized this. The Stephen Vizinczey who jeered at white attempts to demonize slave resistance would have recognized this. The Stephen Vizinczey who fought in the 1956 Hungarian uprising against the Soviet Union would have recognized this. But today's Stephen Vizinczey joins the Israelis raining missiles on Lebanon; no doubt he would have joined the schoolgirls who signed their names on missiles waiting to be fired. This Stephen Vizinczey joins the Hasidim placidly watching the smoke rising above Gaza from a nearby hilltop. The Palestinians are so evil, they want to live, they get angry, they hit back, they try to defend themselves! Condemn them! Yes, there is selective outrage here, but it isn't the critics of Israel who are guilty of it.

Tuesday, December 29, 2009

I'm just too lazy today to write any more, but I liked this photo I found on Blogger Play. It reminds me of winter days when I was a kid. I didn't grow up on a hog farm, but there was one down the road from us. Oddly, the blogger called this much snow a blizzard; I don't think school would have been canceled for it in northern Indiana.

While I'm doing the photo thing, here's one of mine, taken while snow was falling this past Sunday. I was able to get the falling flakes to show up. We didn't get a lot of snow, just half an inch to an inch altogether, and it's almost gone now since the sun came out today.

Monday, December 28, 2009

In the past few years I've been making an effort to revisit books I read as a kid, especially by the science fiction writers who first got me interested in the genre: Robert Heinlein, Andre Norton, Ray Bradbury. Anthologies, especially Groff Conklin's, were a big help in exploring and discovering new writers, and Conklin's Invaders of Earth stood out for me, as it did for many people I think, because it included Edgar Pangborn's first science fiction story, "Angel's Egg."

"Angel's Egg" is the story of an old man who finds an egg-like object that hatches a tiny extraterrestrial creature that looks like an angel. (When I reread the story recently for the first time in decades, I was startled to realize that the "old man" was 53. Why, he was a mere lad!) They commune telepathically, and he learns that she is from a world ten light-years away, that her people have a seventy-million-year history, and that their spaceship crashed on landing. A few survived, including her. She and her people wish to study us. She kills him, with his consent, willingly given to such a superior being. I read it much less skeptically at thirteen, of course.

"What is this 'angel' in your mind when you think of me?"

"A being men have imagined for centuries, when they thought of themselves as they might like to be and not as they are."

"Angel's Egg" announced some of Pangborn's characteristic themes and devices: sentimentality; contempt for humanity and other "lower" species, including women; an older bachelor who befriends a superior, much younger superbeing. Editor Conklin put it more nicely:

Many authors refuse to assume that mankind is the apex, the point of the pyramid, the tip of the top. ... Some authors take it for granted that the creatures from space will be friendly even though they are a few thousand years ahead of us ... and willing to work painstakingly with the few humans who have imagination and ability to learn, even though, in doing so, the aliens might become permanent exiles from their home planet.

Pangborn had been publishing mystery/crime fiction under various pseudonyms since the 1930s, but "Angel's Egg" put him on the science fiction map. He put out several sf novels during the 1950s, along with The Trial of Callista Blake, a crime novel that had a lot in common with Grace Metalious's best-selling Peyton Place. (Sexy scandal in small New England town, world-weary older male characters brooding over small-town narrow-mindedness, beautiful and troubled young woman.) His post-nuclear holocaust novel Davy was the next work of his I read after "Angel's Egg", my fourteen-year-old's attention caught by the buff male model draped over the cover of the paperback. I followed Pangborn's work after that, but ambivalently. I appreciated his skillful, lyrical writing and its growing homoeroticism, but was never really satisfied by it either.

I remember reading A Mirror for Observers in the early 1970s, about 20 years after it was first published, but other than that I remember nothing about it. I returned to it this month after seeing some comments about it online (via):

Both biographical and textual evidence support reading [Pangborn] as gay—or rather, to fit his time period (1909-1976), homosexual, since “gay” implies a certain post-Stonewall consciousness of and confidence about sexual identity.

[Pangborn's] work supports queer readings. He writes from an outsider position assigned by heteronormative culture, and he uses that position to critique social norms and institutions. The best example is his strongest novel, A Mirror for Observers (1954). Although Patricia Bizzell, in one of the few critical pieces on this neglected writer (Dictionary of Literary Biography, volume 8), reads the novel in terms of “an intense homoerotic friendship,” that seems to me a limited or even misguided interpretation of the relationship between the book’s Martian narrator and the gifted boy, Angelo, whom he mentors. Elmis, the Martian observer, is the key to the book; ancient and benevolent, sensitive and cantankerous, he is our species’ confirmed-bachelor uncle.

I disagree with this critic's distinction between "gay" and "homosexual" based on "time period" -- a number of writers of Pangborn's generation, such as Christopher Isherwood (1904-1986), are properly classified as gay writers, even when (like Isherwood) they hated the word "gay." A better description of Pangborn's manner, I suggest, would be "coded" or "closeted." This is probably connected to his work being genre rather than 'literary' fiction, American rather than British or Continental, and to the wishfully pederastic model of the relationships he invents.

So I got A Mirror for Observers at the library. (It's more or less easy to find, having been reissued in 2004 by a small press that hoped to put all of Pangborn's work back into print.) It's a weird, annoying book, with all the faults of Golden Age science fiction. It has the minor disadvantage of being set in the near future -- the 1960s and 1970s -- so you can't read it now without seeing how Pangborn's future history differs from reality; but that's the fun part. My serious objections are more temperamental: a lot of sf celebrates an elect who are set apart from the herd by their high IQs and supposed rationality, and who suffer accordingly. It's a very attractive theme for adolescents, but not a fantasy that should be cultivated and encouraged; it took me a long time to shake it off. It's also the core of A Mirror for Observers.

First, there are the Martians ("Salvayans"), who fled their dying planet to settle on Terra 30,000 years ago. By a remarkable coincidence, they arrived exactly 29,000 years before the beginning of the Christian era, so you only need to subtract 29,000 from the Salvayan dates given in the story to know what year it is in the US. (30,963 becomes 1963, for example.) The Salvayans mostly observe Star Trek's Prime Directive of non-interference with the natives, though at one point we're told that "It was all right to help some of the early tribes find [?] the bow and arrow the way we did, but times have changed." Gee, thanks, Salvayans! In their benign wisdom, they began the arms race, no doubt hoping we'd kill ourselves off, but it didn't work out that way. And just as well, since the Martians have not flourished on Terra; they survived, but not much better than that. They can pass among us by the use of prosthetic makeup and a scent-killer (the narrator is glad that the automobile has made horses largely obsolete, since the equine nose could smell them out); they dabble in our arts and adopt our vices (the narrator likes cigarettes but disapproves of barbiturates as a women's weakness); but they mainly seem to be marking time until they become extinct. It's hard to create credible extraterrestrials, let alone tell a story from their point of view, and Pangborn hardly tries; well, his Martians are assimilated immigrants after thirty thousand years, hardly aliens at all by now.
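The Salvayan-to-Gregorian conversion is simple enough to express as a one-liner; here is a minimal sketch (the function name is mine, not Pangborn's):

```python
def salvayan_to_gregorian(salvayan_year):
    """Convert a Salvayan calendar year to its Gregorian equivalent.

    Pangborn's Martians date their calendar from their arrival on
    Terra, 29,000 years before the Christian era, so the conversion
    is a constant subtraction.
    """
    return salvayan_year - 29000

# Part One of the novel is set in Salvayan 30,963, i.e. 1963.
print(salvayan_to_gregorian(30963))
```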

The year is 30,963. Observers from the North American Missions have spotted a twelve-year-old boy, Angelo Pontevecchio, who has the potential to be, in some obscure fashion, the One. In a subterranean bunker, director Drozma disputes with the evil Namir the Abdicator.

Namir yawned. “So? Did she mention Angelo Pontevecchio?”

“Of course.”

“I hope you don’t imagine you can do anything with that boy.”

“What we hear of him interests us.”

“Tchah! A human child, therefore potentially corrupt.” Namir pulled a man-made cigarette from his man-made clothes and rubbed his large human face in the smoke. “He shares that existence which another human animal has accurately described as ‘nasty, brutish, and short.’” ...

“How can you observe through a sickness of hatred?”

“I observe sharply, Drozma.”

There are echoes here of the biblical book of Job, with Namir as Satan and Drozma as Yahweh. Namir's malignity is basically devoid of motive; he's a moustache-twirling villain from nineteenth-century melodrama. Despite his dislike for organized religion, Pangborn was a basically religious writer, with a Manichaean view of the world, and when he tried to imagine human beings "as they might like to be and not as they are," his imagination fell as short as most religious writers' have done. Drozma, "painfully old now, painfully fat with age," can do nothing but rouse the Observer Elmis from his contemplation and send him to protect Angelo. Or something.

Elmis checks into Angelo's mother's boarding house, and it's love at first sight -- not for Mom, described by the kindly Salvayans as "‘sweet-minded.’ Not much education, and on a very different psychophysical level; a fat woman in poor health" -- but for Angelo.

I knew him at once, this golden-skinned boy with eyes so profoundly dark that iris and pupil blended in one sparkle. … When we admit that the simplest mind is a continuing mystery, what height of arrogance it would be to say that I know Angelo!

Angelo is precocious, reads Plato and Hegel, shows his sensual paintings to old ladies:

Three mares in a high meadow, heads lifted to the approach of a vast red stallion. Colors roared like mountain wind. A meeting of wind and sunlight, savage and joyful, shoutingly and gorgeously sexual. Angelo should have been spanked.

(Yeah, you'd enjoy that, wouldn't you?) Namir lurks in the background, trying to turn Angelo away from his path, whatever it is. Oh, and there's ten-year-old Sharon Brand, whose father runs the local delicatessen. She's a squirm-inducing Shirley Temple retread -- movies seem to have been the extent of Pangborn's interaction with children, boys or girls.

"Look," she said. "Heck, could this autothentically happen, I mean for true?" ...

"Mr. Miles, how can I ever abdiquately thank you?" ...

"By the way, I love you beyond comprehemption."

Namir succeeds in diverting Angelo from his path, whatever it is, but only for a season. In Part Two, set in 30,972, everyone changes names, and Pangborn tries to depict a nativist, cult-like political movement with a mad scientist, followed by cataclysm that wipes out about a third of the human race. Angelo, wounded spiritually, has withdrawn into himself. Elmis comes to the rescue, but with only partial success. The important thing is that Angelo and Sharon are reunited, and Angelo resumes painting. Elmis rhapsodizes:

Never, beautiful earth, never even at the height of the human storms have I forgotten you, my planet Earth, your forests and your fields, your oceans, the serenity of your mountains; the meadows, the continuing rivers, the incorruptible promise of returning spring.

Or, as Bette Davis told Paul Henreid at the end of Now, Voyager, "Don't ask for the moon when we have ... the stars!" I just remembered something someone wrote about Carlos Castaneda and his fictional Yaqui men of knowledge and power: that they love the Earth, but not the people on it. That seems to fit Pangborn as well. There are frequent references in A Mirror for Observers to the need for an advance to an "empirical ethics," but no hint of what Angelo is supposed to contribute to it, or what difference it would make. Pangborn kills off a major chunk of humanity including the faceless masses of Asia and Africa, with hardly a tremor; after all, we just clutter up those forests, fields, oceans and rivers, and corrupt the promise of spring.

Art seems to be his true touchstone -- Angelo is a painter, Sharon a concert pianist -- and that's all very well, but more as an escape from humanity than a key to us. Elmis also celebrates an orbiting satellite (and remember, this novel was written before people actually put such an object in space) as the "most dramatic achievement of human science, I think, and something more than science too -- a bright finger groping at the heavens." He praises Drozma for standing "apart, watching both worlds with a clarity I have never achieved." Well, so did Namir. A Mirror for Observers is a classically individualist work, with an ethic that is barely even tribal.

Sunday, December 27, 2009

A few posts back I threw a small conniption over the corporate / body metaphor of nations, inspired by poet Joy Harjo's line that "a nation is a person with a soul." I mentioned that this is a fascist doctrine, spluttering out some of what bothered me about it, but didn't get the important parts, so here's another try.

As I wrote before, in the Body Politic individuals may become mere cells to be brushed off like dandruff when they're no longer needed. But some cells are more valued than others. The body is made up, in Paul the Apostle's imagery, of various members (Romans 12:3-8):

3 For by the grace given me I say to every one of you: Do not think of yourself more highly than you ought, but rather think of yourself with sober judgment, in accordance with the measure of faith God has given you. 4 Just as each of us has one body with many members, and these members do not all have the same function, 5 so in Christ we who are many form one body, and each member belongs to all the others. 6 We have different gifts, according to the grace given us. If a man's gift is prophesying, let him use it in proportion to his faith. 7 If it is serving, let him serve; if it is teaching, let him teach; 8 if it is encouraging, let him encourage; if it is contributing to the needs of others, let him give generously; if it is leadership, let him govern diligently; if it is showing mercy, let him do it cheerfully.

But not all members are equal. If a member thinks of himself 'more highly than he ought,' he may become unruly and insubordinate. Some are born to lead, others born to follow, and they need each other, okay? Paul developed this theme in 1 Corinthians 12:14-:

14 Now the body is not made up of one part but of many. ... 21 The eye cannot say to the hand, "I don't need you!" And the head cannot say to the feet, "I don't need you!" 22 On the contrary, those parts of the body that seem to be weaker are indispensable, 23 and the parts that we think are less honorable we treat with special honor. And the parts that are unpresentable are treated with special modesty, 24 while our presentable parts need no special treatment. But God has combined the members of the body and has given greater honor to the parts that lacked it, 25 so that there should be no division in the body, but that its parts should have equal concern for each other. 26 If one part suffers, every part suffers with it; if one part is honored, every part rejoices with it. 27 Now you are the body of Christ, and each one of you is a part of it. 28 And in the church God has appointed first of all apostles, second prophets, third teachers, then workers of miracles, also those having gifts of healing, those able to help others, those with gifts of administration, and those speaking in different kinds of tongues. 29 Are all apostles? Are all prophets? Are all teachers? Do all work miracles? 30 Do all have gifts of healing? Do all speak in tongues? Do all interpret? 31 But eagerly desire the greater gifts.

In practice, including Paul's, this nice doctrine of interdependence turns out to be a con. The head, despite its dependence on the feet and the "unpresentable parts", is still the head, and if Christ was the head of the body, different "members" of his body were heads of others:

3 Now I want you to realize that the head of every man is Christ, and the head of the woman is man, and the head of Christ is God. 4 Every man who prays or prophesies with his head covered dishonors his head. 5 And every woman who prays or prophesies with her head uncovered dishonors her head—it is just as though her head were shaved.

Paul shifts here so quickly between literal and figurative use of "head" that it's not always easy to tell which mode he's in; but I hadn't noticed before that he's evidently saying that a woman who prays with her literal head uncovered dishonors her figurative head -- that is, the man who owns her. The dishonor, be it noticed, only moves upwards, toward the head -- or heads, since the Body of Christ turns out to be a hydra.

I'm digressing a bit here, but these passages have been both influential and representative -- by which I mean that Paul didn't make up these metaphors. He was borrowing a metaphor that his congregations would recognize and understand from other areas of their world, and applied it to his churches. And the metaphor persists today, even when it's not explicitly applied. Those who remember the US invasion of Iraq, our Operation Infinite Justice (I know, it's like so 2003! I should look to the future, not the past) will recall that politicians and pundits alike spoke of striking at Saddam Hussein, though in fact we hardly touched him for quite a while. The same language had been used in the first Gulf War in 1991, and during the Clinton administration too. This equation of Iraq's "head," Saddam, with the country made it easy for Americans to ignore the thousands of innocent people we were killing and hurting. They simply didn't figure in the festival of lights that was Shock and Awe.

I was surprised when Saddam was actually captured and executed -- in general heads of state prefer not to go after other heads of state. It's like a scene that was often used in old cartoons: two big guys square off for a fight, one says, "Take that!" and hits a small nerdy guy standing nearby. The other big guy bristles, growls, "Oh, yeah? Well, take that!" and hits the nerdy guy again. And so on: they never lay a finger on each other. And it may not be coincidental that the last American President to lay hands on another head of state was Bush's father, in his attack on Panama in 1989. (And as in the case of Saddam Hussein, Manuel Noriega had been a protected American client until he stopped being useful to us.) Ronald Reagan's 1986 blitzkrieg of Libya, though ostensibly directed at its leader Qadafy, killed civilians, including Qadafy's adopted 15-month-old daughter. Take that, Qadafy! (Americans do not react kindly when the same tactic is applied to us.)

But understanding this conception of head and body helps me grasp at last something that baffled me as a kid after the assassination of John F. Kennedy. Adults, both newscasters and people in the street, kept asking what would happen to America now. Could America survive the assassination of Our President? I was only 12, but I'd already imbibed in school the doctrine that America is a nation of laws, not men. I knew that though Kennedy's murder was a personal tragedy for him and for those who loved him, it couldn't hurt the country. Of course America would survive. A new President had already been sworn in. The death of any person is no less a tragedy, and I distrusted, as I still do, the belief that some people are superpeople, whose lives matter more than those of ordinary folk.

Now I see that these people meant that the US had been beheaded, the head of our national body cut off, and what could keep America from falling over dead? Well, for a while it did seem that it was running around like a chicken with its head cut off. But even to say that is to accept the metaphor that a nation is a person with a soul, with members and unpresentable parts that are subordinate to the head. It's a metaphor that serves to justify inequality within a nation, and ignoring the humanity of people outside it.

Saturday, December 26, 2009

I've written before about biblical literalism and the confusion surrounding it, so I was pleased to read Dennis Baron's essay, "A Literal Paradox", in his collection Declining Grammar (Urbana, 1989). He begins with the text from a New Yorker cartoon of February 28, 1977:

Confound it, Hawkins, when I said I meant that literally, that was just a figure of speech!

I'd already noticed that many people use "literally" as an intensifier, just as many people use quotation marks for emphasis, but I'm so literal-minded myself that it hadn't sunk in that as an intensifier, "literally" means "figuratively." Baron explains that the paradoxical use of words to mean their opposite is a common, longstanding feature of English. "Sanction" (which can mean either to forbid or to permit something) is a well-known case, as is "dust" (either casting dust on an object or removing it). As with many changes in language, though we grammar neurotics rage and gnash our teeth, the figurative use of "literal" is probably not going to go away, especially since so many educated readers (the standard of competence in literacy) are unable to sort out the different meanings and uses of the word -- the current U. S. Secretary of Education, for example. This may not be a problem for most everyday use, but since accusing fundamentalists of reading the Bible literally is such a popular (and inaccurate) ploy, the misunderstanding of "literal" just adds to the mess. Or to the fun, I haven't decided yet. (Baron quotes a writer who warns, "Abuses of the word can seem ludicrous, and those who recognize them enjoy pointing them out" [79].)

Here's what I mean. "Begging the question" is a logical fallacy, known as petitio principii in Latin. It means assuming something that needs to be proved in an argument (and "argument" has to be understood not in its everyday sense of two people yelling at each other, but in its technical sense of "discourse intended to persuade" and "a coherent series of statements leading from a premise to a conclusion"). Outside of academic writing, though, it usually means that a statement raises a literal question that the writer or speaker "begs" to ask, e.g., "Elin Woods is divorcing Tiger -- which begs the question: Elin, when will you marry me?" and that's a reasonable mutation of the meaning. What bothers me is that I find similar misunderstanding of what it means to beg the question even in academic writing. Misunderstanding the technical terms of one's profession doesn't have the real-world consequences in the humanities that it could in the sciences (if your doctor, for example, thought that your femur was a small matriarchal primate instead of your thighbone), but it still seems to me a problem worth taking seriously. In literary criticism, it means at best that the critic has a tin ear for language, so his or her readings can't be trusted.

So, back to "literally." I find accusations of fundamentalist "literalism" in all sorts of writings, including religious professionals who ought to know better, but also among educated laypeople who love to level the charge. Reading Baron's essay sent me back to a Usenet dispute in which I was involved a decade ago. I pointed out that fundamentalists believe that the Bible is inerrant, and must often come up with very non-literal interpretations in order to get rid of errors in the plain, literal meaning of the text. Another person asked me, "Have you conflated 'literal' with 'literary'? I'm confused here." No, I hadn't, and it didn't seem that this guy knew what either word meant. If he were a doctor, I wouldn't trust him with my femur.

Baron, however, gave me a Mobius-strip-like idea. If "literally" often means "figuratively," when anti-fundamentalists accuse fundamentalists of taking the Bible literally, do they really mean that fundamentalists take the Bible figuratively? When they advocate figurative readings, do they mean literal ones?

Thursday, December 24, 2009

This comment appeared today on a post at A Tiny Revolution lamenting Obama's strategic decisions and choices. It was posted by N E at December 24, 2009, 09:55 AM. (No permalink.)

FDR was a good one, but Hoover was actually trying some of those New Deal policies too. There was pressure from the left then. That is what is needed now too. Obama never had a real movement behind him, and unfortunately it doesn't look to me like he believes he can start one. Or if he does, he doesn't know how to do it. He is doing a very bad job at motivational leadership, which is very important.

I'm not sure FDR had "a real movement behind him"; rather, he had real movements putting pressure on him. I think that N E, like so many Obama apologists, believes that Obama would like to do things differently, but doggone it, he just can't -- he's a sweet guy, but "doesn't know how to do it." This is doubtful on many grounds, but foremost among them is this one. (By the way, I must say I wish I had people who'd follow me around, loudly absolving me of all responsibility for everything I do.)

Obama didn't want a real movement behind him, because a real movement might develop a mind of its own. He tried to coopt in advance the kind of people who might start one with his own organization, which was very successful in terms of the goal he set for it: to get him into office. That organization was not exactly invisible: the Obama campaign was singled out by Advertising Age for excellence in marketing, beating out Apple Computers. Immediately after the election, the Democratic leadership and the corporate media were nervous that Obama's supporters -- by which term they did not mean his corporate supporters -- would want a say in how he governed. Obama himself has made clear his dislike of any insubordination among the proles, whether in or out of his army. But not to worry, there hasn't been any serious restiveness yet.

I agree that he's doing a bad job at motivational leadership, but I think the way N E phrased this reveals that he accepts the framework of shepherd/sheep as a model for leadership: the idea that the followers will do whatever the leader wants if he just motivates them in the right way. (What motivates his apologists, I wonder -- all those Obama hasbaristas? I don't imagine they're being paid, or even asked personally to mount the defense, yet they are quite dogged in their loyalty and energy and readiness to vilify all of Obama's critics.) I've seen some indication that some of Obama's ground troops are voting with their feet, and drifting out of involvement. I imagine next November will tell us more. But judging by the e-mails I've seen, forwarded to me by a friend who gets them from the Obama organization, I think Obama doesn't realize how upset many of his former supporters are. He assumes that everything he's doing is fine with them, and they only need to be motivated properly, then given their orders. That isn't how it works. The "movement" has to have goals it believes in, and "Obama in 2012" isn't enough for the grass roots unless it means something other than war, torture, and more corporate welfare. Obama's legacy could turn out to be revealing the limits of voting as a medium of change.

Now, I agree, and have often written about it, that many of Obama's fans had inaccurate and unrealistic beliefs about what he stood for and intended to do, let alone what he could do. But I won't blame them for deserting him when they realize that they were wrong about him. What else should they do, change their goals and values? Continue to support him while he stuffs money into the coffers of corporations and devastates the world? No doubt N E thinks so; certainly Obama thinks so. But I'm not the only one who doesn't think so.

In a previous comment thread at A Tiny Revolution, N E and another commenter indicated what they thought the solution might be. N E first:

Our national game is Capitalism, with a fractional-reserve banking system controlled by private banks being the linchpin of the whole system, so big banks are even more important to those players who want to win than Board Walk and Park Place are in Monopoly, whereas people collectively don't even have a crappy little square like Meditteranean Avenue and individually aren't worth a damn thing. It doesn't matter who is playing the game, even the some-call-him Obaminator, 'cuz that's the game fellas.

So change the game.

I love people who adopt this sort of faux-cynical, worldly-wise realist tone. There are certain problems with what I'd laughingly call the substance, however. N E pictures his game of Capitalism as being played by Obama on one side with, well, it's hard to tell who's on the other side. But he does seem to cast Obama as their opponent instead of their collaborator, let alone their instrument; that fits with his general Obama-apologist approach, that Obama is on the people's side but he's just overwhelmed by the Forces of Evil. That doesn't fit well, however, with the metaphor of Obama as a player in the game.

If you play Monopoly (N E's model for this flight of fancy), you're there to acquire Monopoly property and accumulate Monopoly bucks and put the other players out of business. But the President of the United States is not, in theory or in practice, a player in the game. He won't win by putting the other players out of business, and he's not supposed to pile up wealth while he's in office. (Afterwards is another story.) So the metaphor breaks down there.

Further, Capitalism is not a game, and I've written before about the error involved in equating sport with political economy. But even if it were, it is only one game being played in the United States, however big and powerful it is. Most Americans aren't even in the game, for one thing; we're the runts and nerds nobody wanted when sides were chosen. One could move to the model of spectator sports, which many capitalists would like us to do. We can root for our favorite team, wear its colors, and believe that it matters whether it or the other team wins. There's even the tantalizing illusion that we get to choose our favorite team -- Coke or Pepsi, Windows or Mac, Avatar or Alvin -- because Free Choice is what America is all about.

For another thing, there are no rules in the game that can't be changed -- the corporate "players" expect to be able to do it at will, of course, but they needn't be the only ones. In theory, though, and sometimes in practice, there is also a game called Government, which Capitalism has always tried to control and undermine. That's the game Obama is supposed to be playing, but he seems to be a bit vague about his role in it, partly as a result of spending too much time hanging out with the players from Capitalism. The point of the blog post by Jon Schwarz that started me off today was basically that: who is Obama playing for, and is he a loser or just confused?

So, change the game? Another commenter chimed in with this: "In other words, a revolution. Glad to see we agree NE!" Ah, revolution. Leonard Cohen once said, back in the 60s, that every time you use the word "revolution" it gets delayed ten seconds. (Which means we're probably safe for the next several centuries.) After all, we had a revolution in this country, and it left basically the same people in power. I'd have to have some reason to believe that another one would do any different, and I'm not very trusting. Whose revolution? The people's? Hah.

Right now I'm reading a new book, The Battle of the Story of the Battle of Seattle by David Solnit and Rebecca Solnit, published by AK Press. The title sums it up: who gets to tell the story of the shutdown of the World Trade Organization summit in Seattle in 1999? The corporate media have established the template of a horrific outbreak of anarchist violence, a blow struck at civilization itself, by a bunch of ignorant tree-huggers who pretend to care about the world's poor but are only spoiled brats; it is the WTO and the bold Captains of Free Enterprise who truly care, and who alone hold out hope to the huddled masses.

The reality was somewhat different -- most of the violence was police violence, the police were not out of control but following orders, and the protesters were not only people of all ages and backgrounds but from all over the world; the shutdown also decisively weakened US dominance in the WTO, a blow from which the organization still has not recovered. And the protesters did this without either large amounts of money or the kind of hierarchical, top-down organization that most people take for granted is necessary to achieve such change. (David Solnit remarks that when he was told the budget of Stuart Townsend's 2007 film Battle in Seattle -- $10 million, small for a Hollywood film -- all he could think was that you could fund a hundred shutdowns with that much money.)

How did they do it? David Solnit provides a list of strategic principles compiled by a group of Seattle organizers, which boil down to engaging the media, decentralization, open organizing, clear what-and-why logic, and prior agreements about the types of direct action to use. This sort of thing is unfashionable, because it goes against the grain of professionalization and hierarchy that define a lot of movement work nowadays. A year before Seattle, the Human Rights Campaign and the Metropolitan Community Church decided to organize a Millennium March on Washington, with closed, top-down organization, big names, vagueness about what-and-why (notice the closety name), and corporate sponsorship. It finally took place in 2000 and drew a fair-sized crowd, but not as many as its organizers had hoped, and lost a lot of money. In those respects it failed to achieve its goals insofar as it had any goals beyond partying. Partying is all very well, and certainly accompanied previous gay marches, but it was never clear what political aims the Millennium March had.

Joshua Gamson wrote in The Nation, "The LGBT movement has shifted from one of loosely affiliated activists to one of organizations. Understandably, this freaks some people out. An organizational movement is a different sort of creature, and some of the opposition to the Millennium March is just a recognition that if you're not a member of an organization in the LGBT movement in the twenty-first century, the creature may well bustle along without you." This is false on just about every point. The LGBT movement in the US has always been one of organizations, not "loosely affiliated activists", from Mattachine in 1948 to the Daughters of Bilitis in 1955 to the Gay Liberation Front, Gay Activists Alliance, and National Gay and Lesbian Task Force after 1969. Much of the opposition to the March came from organizations; the individuals Gamson names were often not activists, but citizens expressing their opinions.

The question that the Millennium March stirred up was What kind of organization? Like Gamson, defenders of the March that I engaged relied on a Marxist, dustbin-of-history attitude: HRC was the wave of the future and other kinds of organization were so Seventies, so get with the program or be left behind. HRC's style of money-driven lobbying, built on achieving access to Washington, has its place no doubt, but as the Millennium March showed, it is also a dead end. Richard Goldstein wrote in the Village Voice that when veteran organizer Mandy Carter "tried to persuade [HRC Executive Director Elizabeth] Birch to embrace a more open organizing process" last year, "her feeling was 'Why do we need to take time for those meetings?' For her, it just didn't make sense." It seems that Birch, who once invited a reporter to imagine that "you woke up and found that someone had handed you the movement ... I'll bet you that you would have made most of the same decisions I've made", never learned why it was her style of organizing that didn't make sense.

So, revolution? I don't think so, unless the word is used to mean any kind of serious change, which takes so much of the romance out of it, such as following a bare-breasted babe over a street paved with corpses. Ah, the Struggle! the Glory! But think of what you can do without romance: you can join a movement that brings a powerful international organization sponsored by the United States to its knees. A movement like that could make Barack Obama and his partners in the game of Capitalism take notice. Nothing else will do it.

Wednesday, December 23, 2009

Plenty of people have had nasty things to say about Obama's health care bill (it's his now, just as Afghanistan and Iraq are his wars), and I don't have anything to add to them. Obama loyalists are manning the barricades to cast dust in the people's eyes, but both sides are really irrelevant anyway, aren't they? Obama didn't have private meetings with critics of the bill, any more than he spends afternoons playing golf with them.

Though as Glenn Greenwald wrote yesterday, "the mere fact that the health insurance industry and the market generally sees this 'reform' bill as a huge boost to the industry's profitability does not prove, by itself, that this is a bad bill." No, it's "the corrupt, mandate-based strengthening of the private insurance industry, the major advancement of the corporatism model of government, the harm this is likely to do to some who are now covered and some who cannot afford the forced premiums, and the chances for a better bill if this one is defeated." (To be fair and balanced, I should add that Greenwald is probably far too optimistic on that last point, and that he sees some benefits in the bill as it stands.)

But something small caught my bleary eye last night while I was surfing the Web. As a grammar neurotic, I couldn't help noticing that President Obama told the Washington Post (via), "Every single criteria for reform I put forward is in this bill." That should have been "every single criterion", of course: "criteria" is plural.

This doesn't affect my opinion of Obama; it's a common slip. But I was reminded of all the Democrats who've been exulting for the past year that we finally have a President who can speak proper English. Do they notice Obama's errors? Probably not: in contrast to "nukular," which is a regional variation associated with the supposedly backward South, ignorance of plural and singular forms of certain nouns is common among the Obamatariat -- "phenomena" also seems well on its way to joining "data" and "media" as a singular form. So are the inability to use apostrophes correctly, the confusion of "rein" and "reign", and so on. Such things wouldn't be worth noticing or mentioning if the people involved hadn't been so self-righteous about Obama's predecessor. I've often thought that what really upset so many liberals was not Bush's policies but his accent; their reaction when Obama continues his policies, their reluctance to speak out against his wars for example, tends to confirm my suspicion.

P.S. Here's another example (via) from Nashville Toys for Tots coordinator Staff Sgt. David Carrier, explaining why children of parents without Social Security cards will receive only coal in their stockings: " ... but we have set a criteria." Susie of Suburban Guerilla called it "a very un-Christian thing to do", which reminds me of a Bible story.

24 Jesus left that place and went to the vicinity of Tyre. He entered a house and did not want anyone to know it; yet he could not keep his presence secret. 25 In fact, as soon as she heard about him, a woman whose little daughter was possessed by an evil spirit came and fell at his feet. 26 The woman was a Greek, born in Syrian Phoenicia. She begged Jesus to drive the demon out of her daughter.

27 "First let the children eat all they want," he told her, "for it is not right to take the children's bread and toss it to their dogs."

28 "Yes, Lord," she replied, "but even the dogs under the table eat the children's crumbs."

29 Then he told her, "For such a reply, you may go; the demon has left your daughter."

30 She went home and found her child lying on the bed, and the demon gone.

Jesus, to his limited credit, had a sense of shame. Luckily for her daughter, the Syrophoenician woman had her wits about her, even if she didn't have a Social Security card.

P.P.S. I'm reading Declining Grammar and Other Essays on the English Vocabulary by Dennis Baron, Professor of English and Linguistics at the University of Illinois-Urbana, originally published in 1989. In Chapter 10, "Academies of One: The Critics and English Usage," Baron discusses plurals and the confusion that surrounds them. Children, it turns out, is really a "double plural ... which shows an -en plural (as in oxen and brethren) added to an obsolete plural in -er".

Double plurals are more common than we think. Quite a few of our singulars were once plural, including a number of French borrowings that developed new plurals once they came into English: apprentice is from the French apprentis (sg. apprenti), invoice from envois (sg. envoi) ... Tweezers comes from the French etuis (sg. etui), 'case,' and was originally (a pair of) twees. Native English breeches (from Old English singular broc, plural breech) is a double plural, as is bodices (bodice is actually bodies, plural of body).

Since criteria comes from Greek (by way of Latin), it may not be surprising that even a Harvard man like President Obama is unaware that it's a plural. Certainly many Americans are, though it still seems weird to me -- is criterion so much rarer than criteria that they haven't heard of it? Language changes, though, and it may well be that in a generation or two, criteria will be a standard English singular, with a new double plural form added on.

Tuesday, December 22, 2009

I had an odd, educational conversation in a gay chat room earlier today. I was making small talk with another American, who's about to make his first trip to Korea; he is from California, and mentioned how upset he was by Proposition 8. We agreed that the No on 8 campaign had not been very well run, and I said that I thought part of the problem is that many gay people, including younger ones, seem to be denying the continuing force of homophobia/antigay bigotry in the US.

There's something paradoxical going on with that, because on one hand they are highly aware of bigotry as a threat before they come out, but on the other, when it comes to something like same-sex marriage, they seem to be living in a TV movie where all you have to do is assert The Right Thing and everyone magically comes around in time for the end credits. Maybe one bad-guy bigot remains, but he or she is either ridiculed or exiled. So a lot of the opponents of Proposition 8 seemed to be taken utterly by surprise when they encountered real, serious, deep-rooted opposition. (I mean, like, it's marriage, and marriage is good! Everybody should get married! And it's about equality, and equality is good! How could anyone be against it?)

But it wasn't just kids I had in mind. The professional operatives who ran No on 8 seemed equally unprepared, evidently thinking that a few TV ads would send Evil Mr. Proposition 8 back to his den, muttering "Curses! Foiled again!" Given that money was tight, a volunteer-based grass-roots campaign would have been much more cost-effective. (A California-based friend reminded me soon after the debacle that No on 8 had to compete with the Obama campaign for money and youthful idealism, which is a fair point; but the professionals tend to be opposed to grass-roots work on principle.)

Anyway, my interlocutor and I had just agreed about the denial at work in a lot of gay people's reactions to bigotry when another guy in the chat room intervened. (Thirty years old, Caucasian, chatting from Korea.) He told us that for most gay people, marriage isn't an issue, since marriage is a dying institution. Gay teens don't care about it (!), so they didn't get involved in No on 8. Marriage, he declared, should not be a civil institution. But Proposition 8 was the first time discrimination had been written into a constitution. What he said, aside from being wrong-headed (In My Hubristic Opinion), was irrelevant to what we'd been talking about, and I told him so: we'd been talking primarily about the adults who ran the No on 8 campaign, and younger people's sense of denial about bigotry had nothing in particular to do with marriage. I told him that marriage wasn't a big issue for me either (though I might have added that the young gay kids I work with on Speakers Bureau are mostly very pro-marriage -- more like pro-wedding, really). And even keeping it on his level, Colorado's Amendment 2, which also inscribed anti-gay discrimination into a state constitution, predated Prop 8 by sixteen years. Yes, it was overturned by the Supreme Court; Proposition 8 may also fall, one way or another. But it was not the first, not even the first state constitutional amendment to define same-sex marriage out of existence.

What really seemed to concern this guy, though, was "identity politics" and "playing victim," with an accompanying sense of entitlement, all of which he called "pathetic." He argued that we should just treat people as individuals, not as colors or sexes or sexual orientations, which is what he did, and what was I doing to change society that was as significant as that? I commented that he was throwing out prefabricated boilerplate phrases, and pointed out that antigay bigots, along with Teabag Nation and Republicans generally, also like to present themselves as victims. It's not limited to the standard minorities, who do have real grievances for being treated as their skin color, their sex, their sexual orientation.

He then flipped stances and argued that organizing was the only way we were going to change society, and how did I propose to get gay teens involved in the fight for gay marriage? I reminded him that I don't care if gay teens get involved in that fight, and asked him why he had suddenly decided that "identity politics" was not pathetic after all but a necessary tool for organizing, and why gay marriage was suddenly worth fighting for. He didn't seem to have an answer, and resorted to bluster: so what was I doing for equality and change? I asked him why I should bother to talk to someone who'd simply ignored the content of the conversation he'd joined, who had nothing but slogans to contribute, and kept changing his principles from minute to minute without, apparently, being aware that he was doing so. And there it more or less rested; it was lunch time, and I saw no point in continuing the conversation. (The first guy had dropped out of it early on, to run errands of his own.)

For the record, my personal contribution to equality and change is uncommendably modest. Deciding to be openly gay in a Midwestern college town in 1971 was still a fairly bold decision for the time, and I know I had an effect on the opinions of numerous people, gay and straight, but I'm fully aware of the limitations of such individual choices. I got involved in gay organizations as soon as I found some, but I often found them frustrating because they seemed to have been started without any clear goals, just because organizations were springing up all over the place in those days. But once they had organized, most people didn't seem to know what to do from there. A visible presence on campus and in town, supplying speakers to classes and other straight audiences, setting up a telephone hotline for peer counseling -- all these were good and important, and of course I'm still running the speakers bureau. Some people came to GLF meetings and demanded to know why we weren't lobbying the state legislature, pressuring Congress, marching in the streets. We'd say sure, do you want to get to work on that? They didn't, but they expected us to; activism as a job for servants. Nowadays there's a state-level gay-rights and lobbying organization, run by professional operatives. One of its presidents, from the 90s, was a gay Republican who, inspired by a gay Democrat to see the potential in gay politics, went from the closet to the head of a state organization in record time -- less than a year, it seemed to me. That's not necessarily a bad thing, but it made me wary.

Identity politics has problems as a strategy, as black organizations (for example) discovered when Clarence Thomas was nominated to the US Supreme Court in 1991: should "race" take precedence over Thomas's record as a Reaganite collaborator who climbed to prominence over the bodies of his people? In the end it did so, to many people's chagrin. But identity politics is also a useful organizing tool -- is it even possible to organize people without offering them a group identity, a movement, to organize them into? Just about everybody denounces identity politics these days, with old New Leftists blaming it on postmodernists and postmodernists blaming it on the left and postcolonialists blaming it on the West, so that's an indication that something is wrong. Not that I know what it is.

The reason why today's conversation made an impression on me is that it summed up what is, for me, wrong with so much political discourse. Not just today (it's an old problem), not just in America, and not only on the Internet, but in print media and broadcast media and face-to-face interaction. Primed with slogans and misinformation, people don't listen to the other side enough to know whether, let alone why, they disagree. It isn't easy, as I know very well. The biggest irony was my challenger's insistence that The Answer was to treat people as individuals, when he couldn't be bothered to listen to the individuals he was chatting with.

Monday, December 21, 2009

I took another look at the blog post that led me to write about the War on Christmas, because a word from it had been echoing in my mind all day. That word was "inclusive," which I had trouble believing was really in there because it's such a Politically Correct word to those Canutes who want to return to the 1950s if not the 1890s. But there it was:

There has been a movement in the past decade or so to make Christmas all-inclusive, to call it "holiday" and to expunge any reference to Christmas. Well, Christmas has always been inclusive -- never exclusive. Changing the name to "holiday" does not change the inclusivity of Christmas. It belongs to everyone, but it is still Christmas. How on earth did we allow Christmas to become politically incorrect?

Jo is flat wrong here: Christmas is not "inclusive." Saying "Merry Christmas," for her, is intended to slather Christmas all over the celebrations of non-Christians. (As Stephen Colbert said, there are infinite paths to accepting Jesus Christ as your personal savior.) In one respect that's nothing new, since so much of Christmas as we USAns and Canadians observe it is non-Christian: the Yule, the tree, the holly, the mistletoe -- even the prominent role of Santa Claus / Saint Nicholas / Father Christmas has nothing to do with the Mediterranean dying-and-rising god whose birth is commemorated on December 25. But all this is at best syncretism, not inclusiveness: at worst it's forced conversion and assimilation. Christianity absorbed a good many local religious forms as it spread all over the world, and often it is difficult to say for sure who absorbed whom: was Rome Christianized, or was Christianity Romanized? The Korean Christians I know, for example, have kept the form of a Confucian funeral and reverence for the dead, baptizing them as it were. This is fine as far as it goes, and it's not unique to Christianity by any means.

Actually "Happy Holidays" is the inclusive phrase, because it includes Christmas with New Year's, the Winter Solstice, Hanukkah, Boxing Day, Kwanzaa, and Epiphany. It does not, as Jo claimed, "expunge any reference to Christmas" any more than it expunges any reference to the other holidays. Indeed, that is the real crime of "Happy Holidays": it treats Christmas as just one more holiday, even if primus inter pares. The same attitude is exhibited by people who object to calling heterosexuality a sexual orientation because to do so implies that homosexuality is equal to heterosexuality. I suspect it's also involved when someone denies that Christianity is a religion, because "religion" is what the heathen believe, while Christianity is a Relationship with God or some such nonsense.

Sometimes, it's true, mere ethnocentric ignorance is involved. A young Pentecostal woman I used to work with asked me one day if I'd be going to church for Easter. I explained that as an atheist, I never go to church. "I thought everybody went to church on Easter," she said with unselfconscious directness, and kept repeating that refrain as I explained that Jews don't go to church for Easter either, since though they believe in the same God they don't believe in Jesus; that Hindus, Muslims, and Buddhists don't worship Jesus, so they don't celebrate Easter either. "But I thought everybody went to church on Easter," she insisted. That, I think, is the kind of inclusiveness that Jo has in mind when she says that Christmas is inclusive.

Her attitude wasn't ill-intentioned, but her kind of ignorance, when combined with the War on Christmas crowd's hostility to anyone who won't agree that they own December 25, isn't benevolent either. One of my Facebook friends -- the same one, in fact, who claimed falsely that President Obama had a "holiday tree" in the White House a month back -- posted this weekend to the effect that "They" want to take "Christ out of Christmas." (You know, Them: the same shadowy figures who ruin your favorite movie by making inferior remakes. But then people who'd do that would do anything.) I know she probably just pasted something she'd found online into her status again. She also had it backwards: folks like her won't be satisfied until they've forced Christ into everybody's Christmas, whether we're Christians or not. If the church of her choice hasn't put enough Christ into its Christmas, she should do something about it or find another church. But outside that haven, she needs to mind her own business. Christ never was in my Christmas, and I won't let him in.

Sunday, December 20, 2009

Roy Edroso of alicublog has been writing for the Village Voice for about a year now, I think. He's a good writer, and I mostly respect him even when I disagree with him, as I often do. He's a very valuable chronicler of the right wing of the blogosphere -- I'm very glad he reads Ann Althouse, Michelle Malkin, Glenn Reynolds, and the like so I don't have to. His latest piece, "Social Media Ruined the Internet", is a bit of a departure from his Voice writings and even from his cultural posts at alicublog. He sums it up with some irony:

In brief, the tech revolution has brought us some clear benefits (e.g., LOLcats, free porn), but when it comes to thinking and communicating, it's been a net loss. (Hoists snifter) Perhaps you disagree?

Well, yes and no. In the column itself he writes that the Internet

has evolved to the point where it can't do much more for you. Which is to say, it isn't going to get any better: it will add features, but will basically remain the same tool: a super TV that you can talk back to.

Ah, talking back -- now, there was an innovation: Social Media, the last significant piece in the internet evolution, and beginning of the end of the dream.

I should mention that I -- like a lot of people, I think -- use "the Internet" rather loosely, to refer not only to the World Wide Web, which is the text/image/audio/video interface of today, but to early incarnations that were generally text-only, like Usenet, FidoNet, CompuServe, and America Online, which enabled people to connect from their homes, over telephone lines at first, and interact with people elsewhere: potentially with everyone who had access to those networks. I'm talking here about the mid-1980s, when I bought my first Commodore 64 and, soon after, a 300-baud modem.

Far fewer people in those days had computers, and the computers they had were slower and smaller in terms of memory and storage, but it was tremendously exciting to exchange messages, even chat in real time, with people on the American coasts, and occasionally on the other side of the world. I had a Compuserve account for a while, but dropped it when I got my first $90 bill for a month's activity. I never got onto America Online, but I later met people who did, and learned from gay men that AOL and its competitors were very handy cruising and socializing sites. So I'm confused by Roy Edroso's claim that social networking is a recent, even the latest Internet development. It's been there for over two decades.

Edroso is also critical of blogs, which is somewhat ironic for a blogger, as he admits. Which doesn't mean he's wrong.

It turned out that the internet wasn't an advanced, processing brain, after all, nor an agent of meaningful change. In the political realm, it has had only one enduring value: as a propaganda tool.

This really baffles me. No one who knew anything about computers or their potential could have believed that the Internet was "an advanced, processing brain", could they? Actually, I guess they did; and for most citizens, including early personal-computer users, their notion of the potential of computers came mainly from the Terminator movies. The fantasy that Artificial Intelligence is just around the corner, followed closely by the Obsolescence of Homo Sapiens, is still with us, like predictions of the Rapture. But it's every bit as bogus: the Internet, let alone any individual computer, is not an agent of anything.

A few people have become better informed about national issues because of it, but far more have been made to know to a certainty that the Congressional health care plan includes "death panels," that Obama is a Muslim born in Kenya who will turn America socialist, and that his wife is Marie Antoinette. We admit our own part in this, for we would rather focus on the latest batshit crazy thing Michele Bachmann said than on the details of the East Anglia email scandal (which has proved for many that "global warming is a fraud").

But is this really new, or specific to the Internet? True, the Net enables people to disseminate their writings more widely and quickly and cheaply than print or broadcasting did, but the US has a long colorful history of batshit crazy commentators talking over the radio, spitting out cheaply printed or mimeographed broadsides (that very word embodies a long history of printed political ranting) whose content spread widely. As one commenter at the Voice pointed out, "the Federalists didn't have e-mail in 1800, but their attacks on Jefferson were every bit as scurrilous and widespread as the right's smears."

Churches were also good vehicles for influencing communities. In the 1970s and 1980s I worked with Pentecostals who passed around photocopied tracts about how the communist World Council of Churches was scheming to impose a One World Government that would take all references to the saving Blood of Jesus Christ from all Bibles, and make everyone wear the Mark of the Beast or they wouldn't be allowed to buy or sell. Some of this was recycled from Hal Lindsey's infamous The Late Great Planet Earth, but Lindsey was recycling material he'd picked up from other writers and preachers. Word of mouth has killed more than a few people in this country and elsewhere, stirring up lynch mobs without a microchip or a fiber optic involved. "Thanks to blogs," Edroso wrote, "our political discourse now reads like Red Channels mixed with an Andrew Breitbart monologue." I seem to remember it was always like that. Maybe the Net has made things bigger, faster, and worser, but I still think the difference is one of degree, not of kind.

I've long noticed a strain of snobbism in the ambivalence many educated people (including me) feel about the spread of computers to Joe Sixpack, the Unwashed Masses, and the Housewife.

The non-news Top Twitter Trending Topics of the year include Michael Jackson, Harry Potter, and American Idol. Perhaps you feel as if you became better informed on these subjects because of Twitter, YouTube, Flickr, and Digg. More likely, you were just more inundated with them; you got more video and audio clips, saw more trailers and red-carpet photos, and read more gossip and reiterations of the same bare facts about them. What did social media teach you about Michael Jackson, besides how big a deal it was that he was dead?

And as for Facebook, MySpace, LinkedIn, etc, we can hardly tell you anything you haven't discovered yourself. You have XXX friends; you have XX invitations; so-and-so likes this; view all X comments. These are wonderful tools for shut-ins, of which they have made us all.

Well, yeah, but again the difference is at most one of degree. Supermarket tabloids, the Sports section, Hollywood publicity magazines, and entertainment infomercials have long been with us, and many Americans have always paid more attention to the doings of the stars than to politics. The Black Sox scandal of 1919 disillusioned many ("Say it ain't so, Joe!"), but was it more important than the Red Scare and the plight of World War I veterans? I know I have just uttered blasphemy and sacrilege, but I'm an all-purpose infidel and I'm used to it. As for Facebook et al., as I mentioned earlier, they are new mainly by comparison to the telephone and the telegraph.

Edroso sums up:

The real force behind blogs, Twitter, and all other social media is its users, which is to say, practically everyone on the internet. And this is the saddest part of the demise of the internet as anything other than a microwave for the mind: we are the ones who killed it. And no matter how feverishly we click and scroll and friend and block, nothing we do can bring it back to life.

I think that reports of the Internet's death are somewhat exaggerated. As I commented,

As I recall the early cheerleading for the Internet, it was like most cheerleading for new or "new" technology: partly corporate marketing blather (part of the point was to convince people that every American must not only have their own computer, but a superfast, superadvanced computer on which to file recipes and write e-mail) and partly technogeek masturbation (technology will save the world! or at least I need to have a personal superconducting supercomputer to write code and play Myst on, which will change the world and give every American a faster, more powerful computer to file recipes and write e-mail on, and that will save the world). And there's always been an uneasy feeling among the geek elites that they didn't want to share this wonderful new technology with housewives and couch potatoes who would just waste it on e-mail, filing recipes and analyzing sports stats instead of inventing new computer languages and more advanced role playing games, as God intended. We should also remember America Online, Compuserve, and the like, which also functioned as social sites and provided conduits for netlore that previously had been lower-tech Xeroxlore and low-tech handcopied folklore.

And I disagree with [another commenter]. The dumbification of, say, movie criticism predates the Internet. It is connected to the corporatization of journalism, but then movie reviewing originated as part of Hollywood marketing, and remains so to this day. See Jonathan Rosenbaum's Movie Wars (yeah, it's like totally a print publication, so no link) on the way that, as a serious film critic/reviewer moves up the prestige chain in print media, the space he or she is allotted shrinks. This has more to do with conscious strategies of keeping it simple, imposed by upper echelons who think of their markets as stupid and uninterested in thinking, whether or not their markets fit that image, than with any inherent limitation of print. And while yeah, Sturgeon's Law applies to Internet content as much as anywhere else, the Internet also makes it possible for writers to stretch out with less concern about space limitations or what the Advertisers (like the movie companies) will think. The problem is just finding the good stuff, but then that's always been true. "Of the writing of books there is no end." "Another d----d fat square book! Scribble scribble scribble, eh, Mr. Gibbon?"

Just about everyone, it seems, complains about how much junk there is on the Internet. The fault always lies with someone else -- Them, the Dumb Ones, and so on. I confess I often feel that way when I'm on Myspace with its "pimped" profiles, so overloaded with graphics and music that they are literally unreadable, and utterly tacky and devoid of all taste. Still, all that gives their users and proprietors as much pleasure as customizing their personal copies of Emacs gives many computer programmers. (Or used to give -- I'm out of touch.) And if it weren't for the Great Unwashed out there with their Nascar obsessions, what would the elite wannabes have to play Ain't It Awful over? It doesn't hurt me, or stop me from writing and posting what I want to write and post. Nor does it stop anyone else. If it were decided to bar Teh Stupid from the Net, who would be the gatekeepers? Who would watch the watchers?