The Most Eccentric New Yorkers and the Writer Who Loved Them
Robert S. Boynton
Jul 16, 2015

In the winter of 1988, I arrived in New York with more enthusiasm than good sense—and no journalism experience—hoping to become a writer. Although only an intern at The Nation, I was casting around for people I might eventually profile. One such character was a brilliant autodidact, who happened to live in the apartment one floor below mine.

“That sounds like Joe Mitchell’s pieces about Joe Gould,” a friend of mine commented when I mentioned the idea. The blank look on my face betrayed the fact that I hadn’t heard of either. “You want to be a writer and you don’t even know who Joseph Mitchell is?” he said, voice dripping with scorn.

Humiliated, I did some research and discovered that Mitchell’s last published article, “Joe Gould’s Secret”—about the Village eccentric also known as Professor Seagull, who claimed to be the author of the longest book ever written, the (entirely imaginary) Oral History of Our Time—had appeared in The New Yorker in 1964, a year after I was born. There were no paperback editions of Mitchell’s four collections, and the hardcovers were long out of print, fetching steep prices in secondhand bookstores, if you could find them at all. I also discovered an old photograph of Mitchell, and learned that he lived with his wife and daughters in a modest apartment building on West 10th Street. It wasn’t long before I spotted him strolling down lower Fifth Avenue, wearing his signature three-piece Brooks Brothers suit, freshly polished shoes, and a fedora. I still hadn’t read anything by him, but at least I knew who he was.

According to Thomas Kunkel’s biography, Man in Profile, I wasn’t alone. By the late ’80s, Mitchell “had come to grasp the dreadful irony: If he was known by a modern audience at all, it was for not writing.” A Mitchell renaissance began in 1992, when Pantheon Books published Up in the Old Hotel, his collected New Yorker nonfiction, which spent several weeks on the bestseller list and still sells steadily in paperback. A new generation of readers discovered Mitchell (Born Again was the title he suggested for the collection), and interest in his work continued to grow after he died from lung cancer in 1996, at the age of 87, the year Joe Gould’s Secret was reissued as a stand-alone volume. A charming movie version of it, starring Stanley Tucci as Mitchell and Hope Davis as his wife Therese, appeared in 2000.

Why all the fuss about a New Yorker writer who published virtually nothing during his final 32 years? Mitchell had received an unusual degree of attention since 1943, when the literary critic Malcolm Cowley deemed him “the best reporter in the country,” at least at “depicting curious characters,” in The New Republic. Comparing his characters to those of Dickens, Cowley explained Mitchell’s basic method: He “likes to start with an unimportant hero, but he collects all the facts about him, arranges them to give the desired effects, and usually ends by describing the customs of a whole community.” In 1965, the critic Stanley Edgar Hyman wondered whether the claims made on Mitchell’s behalf by Cowley and others were too modest. He argued that Mitchell was no ordinary magazine writer, “a reporter only in the sense that Defoe is a reporter, a humorist only in the sense that Faulkner is a humorist.” According to Hyman, Mitchell was involved in a more ambitious philosophical project: exploring existential imponderables like “human dignity,” “fertility and resurrection,” and “the depths of the unconscious.”

Mitchell did so in a plainspoken, deliberately unfussy prose style. Kunkel accurately describes it as “the kind of prose that non-writers might have assumed was easy but that professionals knew was anything but.” Mitchell’s long, carefully constructed, fact-laden sentences often culminated in lists, which he used for both their musical effect and the authority they conveyed, whether he was describing New York Harbor’s sea life (“clams on the sludgy bottom, and mussels and mud shrimp and conchs and crabs and sea worms and sea plants”) or the gravestone carvings in a Staten Island cemetery (“death’s-heads, angels, hourglasses, hands pointing upward, recumbent lambs, anchors, lilies, weeping willows, and roses on broken stems”). “Setting these objects side by side in a row has an effect that is both as plain as Shaker furniture and as expansive as a cinematic tracking shot,” writes Luc Sante, one of the many contemporary writers influenced by Mitchell’s prose and outlook. Kunkel notes that “Mitchell stories may not have much plot, as such; the ‘action’ more typically involves human beings revealing themselves to us, bit by bit, usually in their own words, until we become privy to their innermost feelings and impulses.” Mitchell lays down sentences the way a master bricklayer builds a wall: deliberately, one increment at a time, allowing each row sufficient time to set. “I do believe that the most commonplace words are the ones that in the end have the most power,” he writes in an unpublished journal. “I’ll search endlessly for the right small words of a few syllables that hold something up. A foundation.”

* * *

The New Yorker was America’s first urban magazine. It was founded by Harold Ross (the subject of Kunkel’s previous book), the son of a Colorado silver miner and a schoolteacher. Ross was a high-school dropout who became a reporter in New York, where he had an idea for a magazine directed at novice metropolitans like himself. It would be a “reflection in word and picture of metropolitan life,” according to the 1924 prospectus. His insight was less demographic than aspirational. “You cannot keep The New Yorker out of the hands of New York–minded people, wherever they are,” announced a promotion for the magazine. “New York is not a tack on a map, not a city, not an island nor an evening at ’21.’ The New Yorker is a mood, a point of view.” And like The New Yorker’s readers, most of its writers and editors came from elsewhere. “That very ‘otherness’ was key to The New Yorker’s freshness and inventiveness, in that all those creative people were exploring their curiosity about New York within the magazine itself,” Kunkel writes.

Few were more curious about New York City than Joseph Quincy Mitchell, born in Fairmont, North Carolina, in 1908, to a family with roots going back to the Revolutionary War veteran Nazareth Mitchell. Generations of Mitchells farmed cotton, tobacco, timber, soybeans, and corn, and, while not wealthy, they were more comfortable than most in Robeson County, one of the poorest in the South. Joseph was the oldest son, and it was assumed he would eventually take over the family business, regardless of his inability to master the mathematics necessary to navigate the agricultural-commodity markets. He attended the University of North Carolina, where he became a devoted student of Stephen Crane, Dostoyevsky, Turgenev, and Joyce, whose novel Ulysses, then banned, Mitchell was able to read because a friend smuggled a copy into the country for him. Mitchell became a Joyce devotee, a lifelong member of the James Joyce Society, who named his daughter Nora after the novelist’s wife. Ulysses was not the sole object of Mitchell’s interest in Joyce. “The novel that I get down most often is Finnegans Wake,” he wrote in his journal. “I read it over and over, just as one of my grandmothers used to read the Bible. I am now reading it for the seventh time.”

While at college, Mitchell composed fictional versions of the “field sketches” for which he would later become famous. When an article he wrote about tobacco farming was published in the New York Herald Tribune, he decided to go to New York and try his hand at journalism. He received no encouragement from his father, who asked him, “Son, is that the best you can do, sticking your nose into other people’s business?” Mitchell would spend the rest of his life shuttling between New York and Fairmont, a self-described “exile,” consumed by guilt for leaving home and disappointing his father.

Mitchell arrived in New York in time for the 1929 crash, an experience that added to his psychological baggage. “Looking back on it, I think I got scared during the Depression and never got unscared,” he wrote in an unpublished memoir. New York had more than a dozen daily newspapers at the time, and Mitchell got a copyboy job at The World, soon working his way up to a reporter’s position at the Herald Tribune and, finally, the World-Telegram, writing dozens of celebrity profiles of people like Eleanor Roosevelt, Kitty Carlisle, Jimmy Durante, and Tallulah Bankhead. It wasn’t long before Mitchell had become a marquee name, his stories regularly published on the front page, his picture featured on World-Telegram delivery trucks.

Mitchell’s literary journalism grew out of the work of late-19th-century muckrakers and novelists like Crane, Jacob Riis, and Lincoln Steffens. Crane, for one, thought nothing of chronicling the same incident in different genres, as he did when he wrote about being shipwrecked in a newspaper article, a short story, and a magazine piece. Crane used his novelist’s sensibility to render New York as a “mosaic of little worlds.” The historian Alan Trachtenberg wrote that while crusading journalists sought to convert “the reader to social sympathy,” Crane strove to turn “the sheer data into experience.” Taking his cue from Crane, Mitchell became a gifted listener who rendered the people in his stories with novelistic detail. “Mitchell coaxed his subjects with a great and animated enthusiasm, as if the secret to happiness or the meaning of life could be found in their sometimes-dreary monologues,” Kunkel writes.

Mitchell thrived at the World-Telegram, but his work wasn’t dramatically different from his colleagues’. That changed in November 1937, when he was assigned a six-part series on Franz Boas, the German-born Columbia professor of anthropology. The encounter was a “graduate-level seminar in anthropology that caused him to rethink, as a reporter, why people are who they are and do what they do,” notes Kunkel. As the interview proceeded, Mitchell realized that Boas was studying him, “a newspaper reporter,” much as an anthropologist might observe a member of a newly discovered tribe.

Boas was the father of what has come to be known as “cultural relativism”: the belief that societies can’t be ranked objectively, as was the pseudoscientific fashion of the time. He argued that differences between societies were explained by culture, not biology, and that as groups migrated, their traits merged and overlapped with those of the peoples they encountered, resulting in what is today referred to as “hybridity.” Boas approached the societies he studied as subjects in their own right, possessing creativity and will. The anthropologist concluded the interview by giving Mitchell a copy of his book Anthropology and Modern Life. “Don’t take anything for granted, don’t take yourself for granted, or your father,” he advised the reporter. Mitchell left the encounter “feeling born again.”

Harold Ross hired Mitchell in 1938, assuming he’d deliver stories similar to those that had been appearing in the World-Telegram. In a sense, he was right, in that several of the people Mitchell had written about for the newspaper appeared in his early New Yorker stories as well. However, the encounter with Boas had altered his view of the world. “I began to see that I had written a lot of things that were highly dubious,” he recalled.

Here is his 1938 World-Telegram description of Mazie Gordon, the owner-operator of a seedy Bowery movie theater that provided a respite for men down on their luck:

She is known as “Miss Mazie” by the blighted men who exist in the walk-up hotels along the Bowery. Her real name is Mazie Gordon, and she is a blonde with a heart of gold. Her clothing is flamboyant, and she uses cosmetics with abandon.

Here is Mazie, two years later, in The New Yorker:

Sitting majestically in her cage like a raffish queen, Mazie is one of the few pleasant sights of the Bowery. She is a short, bosomy woman in her middle forties. Some people believe she has a blurry resemblance to Mae West. Her hair is the color of sulphur. Her face is dead white, and she wears a smudge of rouge the size of a silver dollar on each cheek…. “I got a public of my own, just like a god-damn movie-pitcher star.”

In the hands of Mitchell the anthropologist, Mazie becomes a willful, multidimensional character, not a stereotyped “blonde with a heart of gold.” Combining the tenacity of a fine reporter with the ethnographic insight of a social scientist, Mitchell discovered a perspective that wasn’t condescending or ingratiating, portraying his characters as neither victims nor heroes. “If the truth was known,” concludes Jane Barnell, the bearded lady whom Mitchell profiled in 1940, “we’re all freaks together.”

This is the Mitchell who inspired generations of writers by showing us how to observe something or someone without preconceptions, as if for the first time. In his hands, the intrepid urban reporter, simply by describing the scene with an air of sincere wonder, provides an oasis of ingenuousness in an all-too-knowing culture.

* * *

Getting hired by The New Yorker was both the best and the worst thing that happened to Mitchell. In order to provide its writers with something akin to a steady salary, the magazine had an unorthodox compensation plan according to which writers would draw money against future earnings, a system that left many of them working as indentured servants for years. The idea of being in debt so terrified Mitchell that he became the sole staff writer with a conventional salary, starting at $100 per week. At first, Mitchell maintained his World-Telegram level of productivity, publishing 13 pieces in 1939, half of which were fiction. Three of the pieces that made his reputation—”Lady Olga,” “Mazie,” and “The Old House at Home”—appeared in 1940.

He began to slow down in the 1950s, publishing only five stories. With his newfound sense of vocation, Mitchell spent more time reporting each of his stories. He began to suffer from depression, and with his small but regular New Yorker salary and revenue from the farm, there was little external pressure on him to produce. Everyone acknowledged that his work was getting even better, and some of his most probing, profound pieces appeared between 1950 and 1964, introducing the world to such memorable characters as Louis Morino, the owner of Mitchell’s favorite seafood restaurant, Sloppy Louie’s; Old Mr. Flood, the 95-year-old retired house wrecker who lived on a diet of fresh seafood, harbor air, and the occasional Scotch; and George Henry Hunter, the chairman of the board of trustees of the African Methodist Church in Sandy Ground, Staten Island.

As the New York that Mitchell knew in the ’30s and ’40s began to slip away, a note of belatedness crept into his work, which had always possessed a healthy sense of nostalgia for the world he had come to know as a young man. His published work remained austere, but his private world grew more overwrought. “I collapsed inside with shame and with pure, unadulterated gazing-down-into-the-open-grave-as-the-coffin-is-lowered bitter choked-up scalding grief,” he writes when a mechanic informs him his car is beyond repair. The sections of Mitchell’s unpublished memoir that have appeared recently in The New Yorker are similarly fraught (“almost everybody has come to seem strange to me, including myself”), displaying the writer’s dark side with little of his humor. The self-doubt that dogged him has become crippling, and he fears that even his most famous painstakingly drawn characters are little more than “stereotypes.”

* * *

Today, the two most commonly asked questions about Mitchell are whether he made things up and why he stopped publishing. In his 1948 collection, Mitchell himself admitted that one of his most famous characters, Mr. Flood, was a composite. Kunkel reviews each of his stories with the thoroughness of a forensic detective and discovers a few more composites, but he is unable to explain why Mitchell used them. A number of his early New Yorker pieces had been published as “fiction” rather than “fact,” so it was public knowledge that he was comfortable with both. That was clearly the case when he wrote about Mr. Flood, who, Mitchell explained, was “not one man; combined in him are aspects of several old men who work or hang out in Fulton Fish Market, or who did in the past. I wanted these stories to be truthful rather than factual, but they are solidly based on facts.”

The last sentence was probably unnecessary. No one doubted Mitchell’s reporting acumen, but when book publishers and admirers were unable to find some of his characters, it raised questions about the existence of the rest. Part of the magic of Mitchell’s writings had always been the way that he inhabited his characters (he assigned Mr. Flood his own birthday, July 27; both ate little other than seafood), imbuing them with the wisdom, perspective, and knowledge that he possessed. But after he confessed to having created some of his characters, the long, seductive quotes attributed to others seemed suspicious. The New Yorker had published composite characters before, especially in its early days, when Ross thought of the magazine as more humorous than serious. Kunkel discovers that, according to Mitchell, it was Ross who suggested that he bring his beloved Fulton Fish Market characters to life in a composite—so, unlike more recent fabulists, such as Stephen Glass and Jayson Blair, the writer had the boss’s permission. And Mitchell feared that readers would conclude that all his characters were composites, which is why he added the note to the Mr. Flood collection. Today, composites are forbidden in all respectable publications, and there are journalism professors who won’t teach Mitchell. I think it makes more sense to think of Mitchell as an heir to Crane—a 19th-century man living in a 20th-century world.

As to why Mitchell stopped writing, Kunkel is less enlightening. Mitchell had been aware of Joe Gould since 1932, and in 1942 he pitched a profile of Gould as “a perfect example of a type of eccentric widespread in New York City, the solitary nocturnal wanderer,” adding that “that was the aspect of him that interested me most, that and his oral history.” When Gould learned that Mitchell wanted to profile him, he telephoned to greet him “at the beginning of a great endeavor.”

Little did Mitchell know that Gould would contribute to his undoing. As Kunkel writes: “In Gould, Mitchell found a near-doppelgänger. Like Mitchell, Gould had left behind his home and a disappointed father. Like Mitchell, Gould was a practiced listener.” It wasn’t until 1957, when Gould died, that Mitchell felt he could reveal the truth about his nonexistent oral history. “Joe Gould’s Secret,” the sequel to the 1942 profile, runs several times the length of the original and has a darker, more confessional tone than Mitchell’s previous work. In a long, melancholy passage, he describes the sprawling New York novel he had wanted to write when he was a young newspaper reporter. Mitchell’s mother died while he was in the middle of writing the second Gould piece, and his exhaustion is apparent in a letter to a North Carolina friend, in which he mentioned “a New Yorker Profile that I’ve been working on for what seems like the last three hundred years.”

* * *

The family business required that Mitchell spend more time in Fairmont in the 1970s and ’80s, but he continued to show up at his New Yorker office every morning when he was in town. Staffers would hear him typing away in his office, and their fascination reached the point where some would rummage through Mitchell’s trash can at the end of the day, in search of manuscript pages. Oddly, the standard answer to the question of why Mitchell stopped writing—that he was Joe Gould—was suggested by Stanley Edgar Hyman way back in 1965: “We realize that Gould has been Mitchell all along, a misfit in a community of traditional occupations, statuses, and roles, come to New York to express his special identity.” And Mitchell spoke freely about his relationship to Gould in a 1988 interview with the scholar Norman Sims. “To me a very tragic thing [about the Joe Gould profiles] is the story of so many people who bit off more than they could chew—and I’m one of them, you know…. Because he is me.” The difference between Mitchell and Gould, of course, is that the former died leaving a substantial body of work, while the latter survives only as a character in it.

One learns more about why Mitchell was unable to write from Janet Groth, a professor of English who became Mitchell’s confidante while working as a receptionist at The New Yorker in the late ’60s and early ’70s. Each week the two would take a long lunch at what remained of Mitchell’s favorite restaurants, discussing the writers they admired, like Joyce and Kafka. As they became closer, Mitchell told her more about the big book. “He told me…he had been trying to write for years—weaving into a seamless whole the passing of the old South, symbolized in the death of his father, and the passing of the old port-and-market New York.” Kunkel tells us that Mitchell decided to make himself the book’s protagonist, but Groth makes it clear that he was, finally, too much the self-effacing reporter to adopt the first-person voice that the New Journalists were experimenting with (and which he loathed), much less the confessional, first-person voice that publications like The Village Voice would legitimize in the ’70s. But if he didn’t use himself as a literary character, who, then, would carry such a sprawling, ambitious story? “Oh, Joe, what a cross you constructed for yourself, and how you crucified yourself upon it!” Groth writes. “It was as if Joyce had tried to write a day in Dublin and a day in Trieste.”

The Plot Against Equality
Robert S. Boynton
Dec 7, 2006

Anyone who still believes in the reality of race ought to spend some time reading graduate school applications. Every year my department receives a few hundred, a growing portion from students who identify themselves as of “mixed race” or fail to check anything at all, leaving me to use my sleuthing skills for clues about their ethnic heritage.

I’m not alone. In the 2000 US Census, 7 million people, 40 percent of whom were under age 18, picked more than one racial or ethnic category for themselves. Between 1991 and 2001 the number of students in higher education whose race is officially “unknown” increased 100 percent. Americans still use the language of race to identify themselves–they just don’t agree about what “race” is.

Why do I spend so much effort trying to fit students into racial categories whose biological basis has been thoroughly discredited? According to literary critic Walter Benn Michaels, author of The Trouble With Diversity: How We Learned to Love Identity and Ignore Inequality, I’m engaging in a fruitless, even sinister, reactionary enterprise, one that distracts vital attention from the only social division that ultimately matters: class. How can I be certain that the “minorities” I’m admitting are truly disadvantaged, Michaels would ask, and not the children of the growing black middle class? And even if these minority candidates are economically disadvantaged, he’d continue, why do I assume that they, by virtue of their ethnicity, will bring more (or different) insights than other students?

Would that I could respond with the theoretical sophistication (though not the repetitiveness) of Michaels’s book. Alas, my answer is quite ad hoc and mundane. Given the pool applying to my expensive private university, I’ve found that race is a fairly reliable proxy for disadvantage (at least relative to the other applicants). While I’m always on the lookout for telltale phrases like “first in my family to attend college,” our application has no “poor” box to check, and virtually everyone requests financial aid.

Michaels is certainly correct that while in theory diversity includes such nonracial characteristics as geographic origin and economic status, they play a minor role in most calculations. In America, diversity is synonymous with race. In both the corporate and educational realms, the jargon of diversity has acquired a holy air. “Diversity has become a word that must be spoken,” Alan Contreras, the administrator of the Oregon Office of Degree Authorization, writes on the website Inside Higher Ed. “Those who don’t speak it in the right slightly breathless tone while looking both sorrowful and committed are unemployable.”

Diversity acquired its current meaning in 1978, when the Supreme Court (in Regents of the University of California v. Bakke) ruled that taking the race of an applicant into consideration was acceptable if it served “the interest of diversity.” Never simple to begin with, the civil rights-era methods of race-based affirmative action were translated into the amorphous language of multiculturalism.

The Trouble With Diversity is a bracing jeremiad, an all-out assault on the way identity in general, and race in particular, is used to organize society. It is also a thought experiment in which Michaels invites us to remove our race-tinted glasses and view the world in the class-based terms that, he argues, actually define it. For Michaels, there is no middle ground, no room for compromise: Race shoved class out of American consciousness, and he wants to reverse the situation. “We love race–we love identity–because we don’t love class,” he writes. The alternative is not to “love” class, since Michaels knows that class, unlike race, is distinctly unlovable. Class inspires no “National Museum of Lower-Income Americans on the Mall” in Washington, and no special holidays celebrating the culture of the poor (indeed, the “culture of poverty” is a sociological epithet); while some poor people inherit their poverty, we would all agree with Michaels that it would be perverse to think of it as their “heritage.” The only area in which we are sentimental about poverty is in studies of working-class culture and literature, in which class is considered a form of identity.

By lumping together the categories of race, class and gender–the holy trinity of academic cultural studies–and treating them as different but equal identities, we have decided to manage inequality rather than reduce, much less eradicate, it. For Michaels, this conceptual sleight of hand is nothing less than a crime.

The Trouble With Diversity is the most recent product of the movement the late cultural critic Ellen Willis dubbed “economic majoritarianism.” Readers of Thomas Frank, Todd Gitlin, Richard Rorty and others will be familiar with the thesis, according to which identity politics has led the left astray, miring it in endless cultural debates that sap the will and sacrifice elections. Only a renewed commitment to economic justice, the majoritarians argue, can revive it (a strategy vindicated, some argue, by the results of the midterm elections). Even literary critic Terry Eagleton has joined the anti-identity crowd, concluding his book After Theory (2003) with the charge that “cultural theory…cannot afford simply to keep recounting the same narratives of class, race and gender.”

Michaels’s approach is more philosophical than that of these politically minded critics. Where they cautiously distance themselves from divisive ideological positions, he embraces them: He calls for redistributing wealth, abolishing inheritance and doing away with race-based affirmative action. He doesn’t care about rallying a particular party or crusading for the underclass (“I’m not writing this book out of a passionate sense of identification with the poor,” he told The Chronicle of Higher Education), and he confesses that even his $175,000 salary isn’t enough to stifle his envy of the truly rich. His strategy is to reveal the flawed foundation on which the concepts of race and identity rest, in the hope that we will then stop caring about them. “Treating race as a social fact amounts to nothing more than acknowledging that we were mistaken to think of it as a biological fact and then insisting that we ought to keep making the mistake,” he writes. “Maybe instead we ought to stop making the mistake.”

You don’t have to be a black person from the slums or a Native American raised on a reservation to recognize the naïveté of this “sophisticated” analysis–resting, as it does, on the premise that a logical deconstruction of a concept can neutralize the power with which history has invested it. Michaels isn’t a reactionary, but his quixotic faith in abstract reason is, as Orwell once wrote, “the sort of nonsense only an intellectual could believe.” In a sense, The Trouble With Diversity is really two books in one. The first is a smart, unsentimental polemic that thinks nothing of declaring the death of a language (like American Sign Language) or a culture (like Bolivia’s Aymara Indians) a “victimless crime.” Coupled with that is a second book that resembles one of those yearly late-night conversations in which freshman philosophy majors scrutinize, and swiftly “solve,” the problems of the world. Michaels is as right about the conceptual incoherence of racial/identity politics as he is wrong and facile about how one might go about remedying it.

Michaels’s fondness for all-or-nothing reasoning first appeared in “Against Theory,” the 1982 essay he wrote with Steven Knapp. Theory had come to dominate literary studies, they complained, leading critics to view themselves as working with the kind of foundational, transcendental principles commonly associated with science or certain branches of philosophy. As an alternative, they advocated more practical, if less precise, approaches, such as New Historicism. No school of thought was exempt from their wrath. “If we are right,” they wrote, “then the whole enterprise of critical theory is misguided and should be abandoned.”

In The Gold Standard and the Logic of Naturalism (1987), Michaels solidified his reputation as one of the leading scholars of American culture with innovative readings of late-nineteenth- and early-twentieth-century novelists like Theodore Dreiser, Frank Norris and Nathaniel Hawthorne. With Our America: Nativism, Modernism, and Pluralism (1995), he rehearsed the ideas that would come to fruition in The Trouble With Diversity. Our America is essentially a genealogy of American multiculturalism, beginning with the story of how authors of the 1920s–William Faulkner, F. Scott Fitzgerald, Willa Cather–contributed to a racial (and racist) basis for American identity. Whereas these authors conceived of their work as shifting from a racial to a cultural conception of identity–from stultifying superiority to a liberating diversity, in which all identities are equal–Michaels showed how their project was not as progressive as it seemed, and in fact lent itself to racial thinking of a kind they would have abhorred. At one point, even the Ku Klux Klan adopted the phrase “Difference Not Inferiority” as a slogan.

Furthermore, Michaels accused contemporary champions of postidentity theory–those who envisage identity as contingent, performative and fluid–of employing the very racial essentialism they oppose. The more we emphasize culture and diversity, he scolded, the more we become mired in race. We inevitably answer the question “What should we do?” in terms of “who we are”–an appeal to racial/ethnic identity. The quest for identity is a vicious circle in which one can never escape the nineteenth-century notion of race. “For racial identity to become a project, it must turn to culture; for cultural identity to become a project, it must turn to race,” he wrote.

In The Shape of the Signifier: 1967 to the End of History (2004), Michaels attacked the identity politics of those who believe they are ineluctably connected to events–slavery, the Holocaust–they never experienced. The notion that the past “belongs” to anyone is absurd, he argued. “Why should slavery and apartheid be compensable,” he asked in a discussion of the reparations movement, while “free but poorly paid labor is not?” His claim wasn’t only that all affirmations of connection to the past are futile but that anyone who makes them is a foe of economic justice. “It’s one thing to celebrate Black History Month; it’s another thing to redistribute wealth. And, in fact, the two things are not only different, they are, in crucial ways, opposed,” he wrote. Race isn’t merely trumped by class in the world according to Walter Benn Michaels, it is, bizarrely enough, obliterated by it.

That we must choose between a society concerned with race and one concerned with economic inequality is the cornerstone of Michaels’s project. But must we? And even if we must, is it really so obvious that the evils of economic inequality always trump those of racism (or that the two can be so neatly disentangled)? It is telling that Michaels never feels the need to formulate an argument for the superiority of a class-oriented society. It is just assumed; all of his energy goes into debunking race. As a result, The Trouble With Diversity has a relentless, monomaniacal tone, its author marshaling more and more evidence to prove that considerations of race are never more than a ploy to avoid confronting poverty.

But there is something perverse about the way Michaels looks at America, as if it were little more than a university writ large. In fact, most of the situations in which race-based affirmative action comes into play have little to do with higher education. There are plenty of spheres (civil service, the military) in which affirmative action has reduced economic inequality–evidence that there are occasions when society can simultaneously tackle racial injustice and look out for its less fortunate members. According to the 2005 Annual Review of Sociology, the wages of both black men and women rose during periods when affirmative action laws were vigorously enforced. Outside the university, Michaels’s either/or choice isn’t always so stark.

At a time when public school segregation is returning to Brown v. Board of Education-era levels, particularly in urban areas, I find it hard to believe that race is as unimportant as Michaels believes it is. Studies that take their inspiration from the racial and class composition of Harvard’s freshman class are bound to end up prisoners of their own (class?) assumptions. Had Michaels spent more time pondering the world beyond the campus walls, he might have reconsidered his assumption that class-conscious, rather than race-conscious, societies are more likely to prize economic equality. What to make of countries like India, Indonesia and France, to mention only a few? Have their attempts to diminish consciousness about identity among their citizens unleashed a surge of concern for economic equality?

The greatest virtue of The Trouble With Diversity is the tenacity and precision with which Michaels dissects our muddled ideas about race and identity. Our obsession with identity has stifled discussion not only of economic inequality but of politics itself. When “the debate we might have about inequality…becomes a debate instead about prejudice and respect,” he writes, “we end up having no debate at all.”

* * *

Toward the end of his memoir, My Brother’s Keeper, Amitai Etzioni recounts meeting with the political consultant Dick Morris. Morris has recently been banished from the White House for consorting with an eavesdropping, toe-sucking prostitute. (Or was he the sucker? I can never keep that straight.) Morris wants to co-write a memo on communitarianism with Etzioni, the better to wheedle his way into Al Gore’s good graces in time for the 2000 election. Morris is in rehabilitation mode, and in the middle of their meeting, the multitasking Machiavelli conducts a brief phone interview with a Christian radio station. “Yes, there was a spiritual vacuum in the White House…. I contributed to it.” When asked what religion he practices, Morris replies that he plays the field. “I work for both parties. Would you expect me to sign up with one religion? I atone in all of them.”

The anecdote is puzzling in more ways than one. What leader of a moral social movement would bother to meet with an apparatchik like Morris, especially after he had been so thoroughly disgraced? Even more puzzling is why Morris would want to meet with Etzioni, the “guru” of the nebulous communitarian movement. I would suggest that the answer to the first question tells us a lot about Etzioni, while the answer to the second reveals the secret of communitarianism’s success in the early 1990s.

The Zelig of public intellectuals–Time magazine once dubbed him “The Everything Expert”–Etzioni has weighed in on nearly every major policy debate of the past forty years. In addition to founding the communitarian movement, he has written several well-regarded academic texts and dabbled in politics. He was a senior adviser in the Carter White House (where he learned “the difference between being a public intellectual out on the hustings as compared to being a member of the king’s entourage”) and has been credited with influencing the centrist, “Third Way” positions of the Democratic Leadership Council, and of Bill Clinton and Tony Blair.

Etzioni’s disparate projects coalesced in 1990. The central idea of communitarianism–which he expounds on in The Spirit of Community: Rights, Responsibilities and the Communitarian Agenda (1993), The New Golden Rule: Community and Morality in a Democratic Society (1997) and Next: The Road to the Good Society (2001)–is that a society must balance liberal rights with community responsibilities. Harvard Law professor Mary Ann Glendon has likened communitarianism to “democracy’s environmentalist movement, helping to heighten awareness of the political importance and endangered condition of the seedbeds of civic virtue.” Communitarians conceive of society as a three-legged stool, held up by the forces of the state, the market and, yes, the community. They are skeptical of the rights-oriented, legalistic, interest-group politics of the liberal state. A precursor to the “third way” movements of recent years, communitarians want to “leapfrog the old debate between left and right and focus on the role of the community, culture, and virtues rather than on either the private sector or the government,” Etzioni writes.

A nice theory, but how does a movement that eschews “politics as usual” (rights, laws, etc.) go about changing societal values? Rather than coercing behavior via laws, communitarians advocate using one’s “moral voice” to persuade fellow citizens through shame and appeals to community norms. “Communities lead with their moral voice, appreciating those who act responsibly, and chastising those who do not,” Etzioni writes.

The movement’s policy recommendations run the gamut–depending on your perspective–from innocuous do-gooderism to authoritarian intrusiveness. Communitarians advocate mandatory national service, campaign finance reform, the two-parent family, sobriety checkpoints and drug-testing for people (like engineers and pilots) whose jobs give them tremendous public responsibility. More controversially, communitarians advocate covenant marriages–in which a couple agrees to participate in marital counseling and delay divorce for a few years if one partner files for it–as a way to “encourage” (a word communitarians use a lot) family stability. Furthermore, they argue that people at high risk for HIV should be tested, and if the results are positive, should be encouraged to inform their previous and prospective sexual partners.

While not on the same philosophical level as the communitarian writings of Michael Walzer, Michael Sandel, Alasdair MacIntyre and Charles Taylor, Etzioni’s work has provided the most sustained institutional response to the liberal individualist project, of which John Rawls’s A Theory of Justice is the most powerful expression. Much like the neoconservative movement from which he says he takes inspiration, Etzioni is more interested in influencing policy than philosophy. To paraphrase Irving Kristol, Etzioni is a liberal who was “mugged” by community.

As the age of Reagan waned, Etzioni sensed that

a reaction was setting in to the excessive individualism that neoconservatives and their followers helped to foment. The country was yearning for a less one-sided way of thinking, a third way between the worshipers of the market and state-hipped liberals, an approach that would not ignore core social-moral values. To effectively respond, the American renewal project needed a school rather than merely individual thinkers, each working on his or her own.

To this end, he founded a journal, The Responsive Community: Rights and Responsibilities, which is published under the auspices of the Institute for Communitarian Policy Studies of The George Washington University, where Etzioni is a university professor. In 1991 he published “The Communitarian Platform,” which announced the movement’s principles. And he surrounded himself with an impressive group of thinkers: Mary Ann Glendon, an expert on family law and civil rights; William Galston, a political scientist at the University of Maryland, who was issues director for Walter Mondale’s 1984 presidential campaign and Deputy Assistant for Domestic Policy under Clinton; the political theorist Benjamin Barber; the ethicist James Fishkin. Over the past decade, Etzioni & Co. (the names above, as well as Robert Bellah, Jean Bethke Elshtain, Nathan Glazer, Martha Minow and others) have churned out hundreds of books, monographs, position papers and editorials, while also sponsoring dozens of panels and conferences across the country.

In a sense, the story of communitarianism is the story of Amitai Etzioni–a story that might itself have been lifted from the pages of a Leon Uris novel. Smuggled out of Germany in 1934, the 5-year-old Werner Falk (as Etzioni was then known) moved from Italy to Greece, and finally, in 1937, to Haifa, where he thrived in the communal life of a kibbutz. His father fought against Hitler under the auspices of Great Britain’s Jewish Brigade, and young Amitai–his adopted name was created by fusing the Hebrew words “truth” (emet), “tree” (etz) and “Israel” (Zion)–did his part by smuggling fleeing European Jews into Palestine. In March 1947, at age 18, he attended a meeting where David Ben-Gurion announced, “The time is right for us to take the ultimate risk and demand and fight for the formation of a full-blown state.” Writing in his diary, Etzioni wavered between dedicating his life to “riches and status” or to “service for the common good.” Soon after, he quit school and led a platoon in the Palmach (the commando unit of the Haganah) in the fight for Israel’s independence. (Two-thirds of his unit were either killed or wounded defending Jerusalem.) His first book, A Diary of a Commando Soldier (a collection of newspaper columns he wrote during the war), was a bestseller. The 21-year-old Etzioni had discovered his public voice.

Lacking a high school diploma, Etzioni had difficulty finding a university that would admit him after the war. Luckily, Martin Buber was looking for students to attend his new institute. In addition to studying Kabbalah with Gershom Scholem, Etzioni worked with Buber himself, and was profoundly influenced by the philosopher’s notion of “dialogue” (“a give-and-take during which people open up and reach each other profoundly”), as well as his famous distinction between “I-Thou” relationships (treating others as fellow human beings) and “I-It” relationships (treating others as objects). Later, while studying sociology at Hebrew University, Etzioni discovered the concept of “anomie,” the condition of spiritual aimlessness that Durkheim argued was the result of modernity’s loss of social fabric. The essential concepts of communitarianism were in place.

Etzioni arrived in Berkeley in 1957. Eighteen months later–having received a PhD in sociology in record time–he landed a job at Columbia, home to the legendary sociologists Paul Lazarsfeld and Robert Merton, as well as the iconoclastic C. Wright Mills. Although Etzioni’s scholarly work was well within the confines of 1950s quantitative social science (his first scholarly book was titled A Comparative Analysis of Complex Organizations), his activism soon got him into trouble with his more “value neutral” colleagues. After Lazarsfeld called him on the carpet for writing a review of the art film Hiroshima Mon Amour (“The last thing we need is another C. Wright Mills,” he scolded), Etzioni concluded that the “images from Hiroshima sufficed to confirm my belief that humanity could not possibly do without my administering to it.” He vowed to become a scholar and an activist, a public intellectual. “Here I stand; I can do no other,” he announced to his wife, invoking Luther.

It is in the sections in My Brother’s Keeper on his career as a public intellectual that Etzioni’s memoir is most revealing. He is clearly a man who revels in the wonkish public-policy battles that take place beyond the seminar room (which was one of the reasons he moved to DC’s George Washington University). And he is clearly a man of conviction–the “conviction” typically being that he is correct, his ideas on “the side of the angels.” But it isn’t enough to be right; it drives him crazy when he isn’t credited for his insights. He writes, “I was pissed about those occasions in which I had been able to see around the corner–on matters of considerable public import–but for which my observations were not appreciated. Why do many people find such claims so annoying?” (I wonder.) Unlike most public intellectuals, Etzioni isn’t satisfied merely to pen the occasional editorial, write a crossover book and appear on television. He yearns for “the special high that I tasted in Israel, of participating in a project that was larger than life, greater than self. For accomplishments that changed something in the real world.”

Etzioni’s memoir is replete with examples of his intellectual activism. In 1962 he implored Martin Buber to ask Pope John XXIII to defuse the Cuban missile crisis by placing calls to John F. Kennedy and Fidel Castro. Etzioni opposed the Vietnam War (Winning Without War, 1964) and railed against the diversion of money from the War on Poverty to space exploration (The Moondoggle: Domestic and International Implications of the Space Race, 1964). In 1968 he and his colleagues formed a human chain around one of the buildings occupied by protesting students at Columbia University, where he was a sociology professor. His 1973 critique of the budding bioengineering movement (Genetic Fix: The Next Technological Revolution) was nominated for the National Book Award. More recently, he has weighed in on campaign finance reform (Capital Corruption: The New Attack on American Democracy, 1984), privacy (The Limits of Privacy, 1999) and race (The Monochrome Society, 2001).

Academic departments are often ranked by the number of times faculty work is cited in scholarly journals, and the pecking order of policy advisers depends on the power of their advisees. But how is one to gauge a public intellectual’s effectiveness? In Public Intellectuals: A Study of Decline (2002), Richard Posner analyzed the “public intellectual market” using data provided by Amazon.com rankings and popular press citations. Posner considers the public intellectual a kind of intellectual soothsayer who should be judged by how often his “predictions” hit the mark (utopians need not apply). I didn’t think anyone took Posner’s suggestions seriously until I read Etzioni’s memoir. In it, he provides exhaustive accounts of the reviews his books received, as well as summaries–and quantitative analyses–of communitarianism’s press coverage. For Etzioni, the medium is the message. He believes that evidence of communitarianism’s influence lies in the number of times its “keywords” appear. “In the 1990s, the phrase ‘rights and responsibilities’ appeared some 6,183 times in the top fifty newspapers alone,” he writes. These were Etzioni’s Golden Years. “For months on end, we seemed to be on the front of most everyone’s Rolodex.”

The second half of his memoir is a cautionary tale for all would-be public intellectuals. I suspect Etzioni never recovered from his stint as a “member of the king’s entourage” in the Carter White House, and has spent much of his energy since then trying to whisper in the ear–any ear!–of those in power. “The trick was to find ideas that were both honestly communitarian and not impolitical.” Some trick, indeed.

While Etzioni’s dealings with Clinton and Blair have been well documented, in his memoir it becomes clear that his criteria for offering advice depend less on ideology than on access. We witness him regress from a passionate intellectual to a Loman-esque figure, desperately hawking his communitarian wares to anyone who will listen. He tries to sell communitarianism to Helmut Kohl (not interested), Bob Dole (“There was no sign of their Christian spirit, that of reaching out and caring for vulnerable members of the community, which is so much a part of the values they were anxious to uphold.” Shocking!), and George W. Bush (“His tone and demeanor were often soft and conciliatory; that is, communitarian”). Etzioni implores Janet Reno to rethink her commitment to the Fourth Amendment (she demurs).

Etzioni’s media profile faded in the late 1990s. The communitarian message didn’t feel so fresh, and some of its policies seemed downright creepy. Despite Etzioni’s embrace of Buberian “dialogue,” his presentations felt more like monologues: No matter what the subject, “balancing rights and responsibilities” was always the answer. In 1994 the Guardian asked, “Is Etzioni just a Jerry Falwell in cap and gown? Could communitarianism be a thinking person’s Moral Majority?” Etzioni dutifully records that the movement’s media citations peaked in the mid-1990s. “By the late 1990s, there were more and more days, then weeks, when no one called. Invitations to speak and to attend conferences ceased to pose scheduling problems; there were no longer any who wanted me to be in two places at the same time.”

Much of the difficulty had to do with his “third way” communitarian message. The political blood-sport of the Clinton era made Etzioni’s plea for nonpartisanship sound naïve, if not disingenuous. If Clinton could gut welfare while simultaneously praising communitarianism (“You are my inspiration,” Clinton told Etzioni one New Year’s Eve), maybe the movement was more style than substance. Were communitarian ideas merely protective coloration for politicians of the left and right? Was a movement admired by Bill Bennett, Dick Morris and George W. Bush itself worth admiring?

And the more closely people considered Etzioni’s proposals, the more it became apparent that many were either stunningly obvious (“If the advocates of civil rights and those of public safety would stop butting heads, we would see all kind of ways to advance our security while minimizing intrusions on our liberty”) or absurdly utopian (a “megalogue” on values between members of a super “community of communities”). Wish-and-make-it-so public policy.

I think the reason communitarianism never had the impact of, say, neoconservatism has to do with its message as well as its method of implementing its ideas. Communitarianism speaks the language of reform, not revolution. It seeks to temper the primacy of the individual, to tame the logic of the market, to lessen our reliance on government and its laws. It is more “liberalism rightly understood” than an ideology in its own right. Etzioni is less a prophet for a new idea than a publicist for a worthy, but not particularly novel, point of view.

Toward the end of My Brother’s Keeper, he describes his relations with the Clintons: Hillary, who cites him in It Takes a Village, and Bill, who casually leaves a copy of Etzioni’s book, The Spirit of Community, on his Oval Office desk during a visit by the press. It “was one more sign, to put it grandly, that these ideas were in step with history.” Not ahead, and not behind. And that, ultimately, is the problem.

* * *

When I was an editor at Harper's, I would regularly receive essays from professors hoping to reach beyond the boundaries of their disciplines and communicate with a wider public. Although I confess I opened many of these submissions with a sense of dread, more often than not I was pleasantly surprised by their eloquence and relative accessibility. Contrary to the old saw about academics and impenetrable prose, most of these writers knew how to wear their learning lightly, and their essays were a testament to the proposition that clear thinking and good writing are as likely to be found within the university walls as beyond them.

Occasionally the results were not so happy. I recall one piece by an ambitious young scholar whose prose, he assured me, was 100 percent jargon-free. And, sure enough, it was. The problem, however, was that while he had diligently expunged words like deconstruct, hegemony and problematize–I suspect his computer came equipped with a "find and replace jargon" function–their conceptual ghosts remained. When stripped of his theoretical armor, he limped along unimpressively in an intellectual no-man's land and didn't, it became apparent, have much to say.

Like many academics in recent years, he was consumed by the desire to become a public intellectual. At one point the frenzy for relevance got so out of control that an English professor with a letter to the editor in the local newspaper might boast of having made a "political intervention." (I heard of one professor who had the PI honorific embossed on his business card.) With the proliferation of outlets like cable television and the Internet, intellectuals generally have less difficulty reaching the public than they once did. A trickier task is attracting an audience while maintaining intellectual credibility. Whether motivated by jealousy or dismay, one's colleagues may not look so kindly on one's newfound vocation. If being a public intellectual has never been easier, remaining a private intellectual has never been more difficult.

One might read Marjorie Garber's new book, Academic Instincts, as a meditation on this tension. "In their heart of hearts, scholars long for public and even popular recognition. The Holy Grail of the 'crossover book,' one that impresses one's colleagues but also appeals to the intelligent general reader and perhaps even makes the best-seller list, is a recurring dream in the profession," she writes. The William R. Kenan Jr. Professor of English and director of Harvard's Humanities Center, Garber knows whereof she speaks. Recently described by the New York Times as "one of the most powerful women in the academic world," Garber divides her books between cutting-edge presses like Routledge and commercial houses like Random House and Simon & Schuster (which paid a $180,000 advance for Vice Versa, her study of bisexuality). She is an extremely prolific and often graceful writer whose work appears in The New Yorker, the New York Times and The London Review of Books.

The author of three well-received scholarly studies of Shakespeare and a half-dozen works of eclectic criticism, Garber is the reigning queen of cultural studies. Whether opining on cross-dressing, bisexuality, the erotic relationship between sex and real estate–or between dogs and their owners–Garber is so compulsively witty, so imaginative and wide-ranging, that she raises intellectual improvisation to an art form. She is the dinner guest every hostess covets, the indefatigably charming conversation partner who, no matter how obscure the topic, keeps things going.

Garber established her modus operandi in her first crossover book, Vested Interests: Cross-Dressing & Cultural Anxiety (1992). "The tendency [of the critic] has been to look through rather than at the cross-dresser, to turn away from a close encounter with the transvestite, and to want instead to subsume that figure within one of the two traditional genders," she writes. Not Garber. Never having seen a distinction she couldn't subvert, she conducts a properly transgressive analysis of cross-dressing, swiftly dispenses with the false sexual binarity separating the concepts of male and female, and declares victory. And what a victory it is. According to Garber, transvestism represents a category crisis–not only for human sexuality but for the very notion of a category itself. Once considered little more than a cultural oddity, in Garber's hands cross-dressing prophesies nothing less than the end of epistemology. The New York Times praised the book as "a provocative piece of cultural criticism." The Holy Grail was hers.

With Vice Versa (1995), Garber ratcheted things up a notch by exploring the false binarity in the "eroticism of everyday life," the Ding an sich of cultural studies. If her strategy seemed somewhat familiar–bisexuality is "an identity that is also not an identity, a sign of the certainty of ambiguity, the stability of instability, a category that defies and defeats categorization"–her subject felt more substantial. "Is bisexuality a 'third kind' of sexual identity, between or beyond homosexuality and heterosexuality? Or is it something that puts in question the very concept of sexual identity in the first place?" she wondered.

If nothing else, Garber's lively romp through the lives of cultural figures who one had thought were steadfastly straight or gay–John Maynard Keynes, Harold Nicolson, John Cheever, Leonard Bernstein, Marlon Brando, Erik Menendez, Georgia O'Keeffe, Frida Kahlo–was extremely entertaining. Although her overheated prose sometimes resembled fashion magazine advertising copy ("Borderlines are back: Ethnic, racial, religious, and sexual minorities assert their visibility and, thus, their power"), her point was serious. After all, the world is full of sexual and romantic entanglements that defy standard categories; sexual identity does seem to operate along some kind of a continuum. Reading Vice Versa, one suspected that bisexuality–even if not present absolutely everywhere, as Garber intimated–was surely vastly underrepresented in a world bound by the assumptions of identity politics. While Garber's thesis was not particularly radical (Gore Vidal quipped that it was "about three centuries overdue") and her reasoning occasionally flimsy (does a single heterosexual dalliance really transform a lifelong homosexual into a bisexual?), these were important issues about which she was genuinely concerned.

Unfortunately, things went downhill from there. Two of Garber's later works, Dog Love (1996) and Sex and Real Estate (2000), are the kinds of literary follies men of leisure might write on a dare. Though prodigiously researched and fluently written, neither offers an argument for anything beyond its author's intellectual ingenuity. Writing in The New Republic, Zoë Heller described Sex and Real Estate as "so serenely silly–so untroubled by any whiff of a serious idea–as to invite a kind of awe." Sentiments like these have made Garber, along with Berkeley's Judith Butler and NYU's Andrew Ross, the whipping girls and boy of cultural studies.

Garber doesn't identify it as such, but Academic Instincts is clearly a response to her critics. The book is advertised as an exploration of the pleasures and pitfalls of the academic life, one that opens the door to an important nationwide and worldwide conversation about the reorganization of knowledge. Published by the staid Princeton University Press, sporting a cover illustrated with Raphael's School of Athens (although "digitally enhanced" with a photo of Garber posed with her golden retrievers, Wagner and Yofi), the book is a valiant attempt to convince her colleagues that she can do the job after all. Although filled with her standard potpourri of pointed observations and illuminating examples, the text positively bristles with arguments. But after describing the book in her preface as an analysis, an intervention and a credo, Garber hastens to add that it is also a love letter. And indeed, although seemingly structured with the precision of a work of analytic philosophy (three slender chapters on persons, institutions and language), Academic Instincts is ultimately informed by Garber's favorite psychoanalytic theme: the ineluctable desire everything has for its opposite–a desire that inevitably subverts itself, thereby undermining the very distinction it sought to overcome.

Garber navigates her tripartite structure brilliantly, ferreting out traces of desire in every corner of the dusty academy. Professors are jealous of amateur thinkers' independence (and vice versa); each academic discipline covets its neighbor's superior insights (literary studies envies philosophy, which in turn envies law and/or science); and on the level of language, each discipline attempts to create a technical vocabulary specific to its area of expertise (a.k.a. jargon), while at the same time longing for "a universal language understood by all."

According to Garber, these feuds are essentially unstable, rife with a "doubleness" that precludes any side from ever triumphing over another. As with cross-dressing, and bisexuality before it, Garber's point in Academic Instincts is that we cannot–indeed, should not!–help looking beyond the false binarity of these intellectual constructs to appreciate the exhilarating cacophony of "the conversation of mankind," a phrase she borrows from Richard Rorty (who is, in turn, echoing Michael Oakeshott). "The point is not to choose the right inflection for each term but to show how intellectual life arises out of their changing relationship to each other," she writes. More succinctly, Garber's point is never to choose anything.

With Raphael's School of Athens as her talisman ("a transcendent, multitemporal, interdisciplinary moment in which everything in intellectual life is in the process of being discussed, negotiated, and remade"), Garber does what she does best: she champions a relatively uncontroversial thesis–human sexuality is multifarious, dogs are man's best friend, people love their homes, vigorous discussion enhances intellectual life–with a panache that makes it feel at once daring and completely palatable. (I wouldn't be surprised if Harvard's wizened Board of Overseers slipped a copy of Academic Instincts in with its next request for alumni donations.)

While her point may not be novel, the way she argues it is fascinating, both for itself and for the style of thinking it represents–a style that has become all too typical in cultural studies. Imagine Garber's mind as a kind of intellectual black box: every either/or proposition that enters exits in the form of a both/and conclusion. Once inside the black box, the either/or proposition is processed through a maze of checkpoints–fake segues, tendentious comparisons, deceptive syllogisms, overbroad generalizations, misleading historical precedents, witty wordplay and sheer chutzpah–before being spit out as elegant yet inoffensive soundbites in the conversation of mankind.

"When you stop and think about it…" is one of Garber's classic introductory phrases–a rhetorical sleight of hand whose effectiveness depends precisely on the reader's not pausing to consider the validity of the what follows. Then, before you can think, Garber is off and running, burying the reader under a mountain of "evidence," occasionally pausing to reload ("It is interesting to note," "It is interesting to recall") before continuing the onslaught.

Take, for example, her argument that the professional wants to seem like an amateur, since amateur status is thought to guarantee virtue. "Politics is a dirty business, and a professional politician an object of suspicion. Better to have a background in something, almost anything, else," she writes. "Like sports, for example," and we're off on the trail of athletes-turned-politicians Bill Bradley and Jack Kemp. "Or consider, at least in the state of California, politicians from the world of entertainment," which is followed by a consideration of Ronald Reagan and Clint Eastwood.

But if you really do stop and think about it, Garber's examples usually fall into two categories: the trivially true and the false. American politics has always been a profession one joins from the outside. Unlike law or medicine, it requires no advanced degree, so every new officeholder is an amateur, an outsider–whether he comes from the world of sports, entertainment or business. Being an amateur is therefore a necessary, but not sufficient, condition for becoming a politician. Generally, it helps to have been successful in your previous profession; failed actors and mediocre athletes don't tend to get very far in politics. While a candidate's amateur status is hardwired into the structure of the political system (although some wield their "outsider" credentials better than others), the far more important factor, which Garber willfully overlooks, is "success."

The last and most argumentative chapter of Academic Instincts is about jargon, which Garber refers to as "Terms of Art." Jargon is the cultural theorist's Achilles' heel, the point at which the tension between the public and private intellectual is greatest. The purpose of jargon is to make intradisciplinary communication more efficient. Thus, jargon is only a problem for the specialist who wants to cross over and speak to noninitiates about his field. Who has ever criticized a chemist for communicating with fellow chemists in the language of the periodic table, or mathematicians for speaking in algebraic or geometric terms? Objections like these literally don't make sense.

So why don't would-be public intellectuals–professional academics who covet the breadth and audience of an amateur–simply eschew their disciplinary jargon? The reason is that jargon serves a double function; as the linguist Walter Nash writes in Jargon: Its Uses and Abuses (1993), it is not only "shop talk" but also "show talk," a means of impressing, sometimes mystifying, the uninitiated. The funny thing about jargon for academic public intellectuals is that it is something–as the old saying goes–they can neither live with nor live without. It makes them both understood (within their profession) and not understood (outside it). And this "intelligibility gap" is the very essence of modern professionalism. Without it you're just another thinker, autodidact or generalist who is at home nowhere and everywhere. With it, you're a fully credentialed specialist whose pronouncements carry the added weight of disciplinary and institutional prestige. As with all either/or propositions, Garber wants to have jargon both ways: as a sign of marginality (and hence moral superiority), and of professional expertise.

According to cultural historian Peter Burke's introduction to Languages and Jargons (1995), the word jargon has an extremely long and wide-ranging history. A medieval word that first appeared in Provençal and French in the twelfth and thirteenth centuries, it was used by Chaucer to describe the twittering of birds. In the fifteenth century, it indicated the language of a marginal or foreign culture (Kafka later called Yiddish a jargon). By the sixteenth century it meant gibberish (gargle and jargon derive from the same root). In an odd twist, there was even a period when it could specify a form of intercultural lingua franca. But by the early eighteenth century, it had taken on its primary modern meaning as the vocabulary of the professions. Hence the Oxford English Dictionary defines jargon as "any mode of speech abounding in unfamiliar terms, or peculiar to a particular set of people, as the language of scholars or philosophers, the terminology of science or art, or the cant of a class, sect, trade or profession." With the rise of industrial society and the proliferation of professions in the nineteenth and twentieth centuries, there was a veritable explosion of jargons.

This complex history makes jargon a perfect candidate for Garberian analysis. There aren't many shades of meaning that haven't been ascribed to the word in the past 800 years–an ambiguity that Garber uses to suggest that what passes for jargon today "is in the ear of the listener."

Although exhaustively argued, Garber's defense of jargon is relatively simple. Since language is a living thing, yesterday's jargon may very well be today's standard speech. Then comes the deluge: For Adorno, author of The Jargon of Authenticity, the words "authenticity," "genuineness," "transcendence" and "belief" were jargon. Orwell considered "romantic," "plastic," "values," "human," "dead," "sentimental," "natural" and "vitality" to be jargon. ("It would be interesting to know what kind of response Orwell might have had to the movement that has grown up in his name," Garber observes coyly.) Shakespeare alone introduced more than 1,500 words, including "label," "lapse," "dialogue," "design," "accused," "addiction," "rival" and "anchovy," into written English. "Could we imagine doing without them?" Garber asks.

Well, certainly not "anchovy," although we could probably make do without most of the others. But isn't Garber's suggestion a straw man? Who in his right mind argues that we should dispense (her word) with them? Words like "values" and "belief" are indeed vague and clichéd, the kind of dead language that good writers avoid in order to keep their work from feeling stale. But are they jargon in any recognizable modern sense of the word?

Quicker than you can say Aufhebung, Garber has surveyed jargon's linguistic history, broadened its definition, resolved the tension between technical and nontechnical language and transcended what she calls the "paradox of jargon." She does this by constructing a syllogism so slippery it would make Socrates blush, then daring us to question her faulty logic: (1) Jargon encompasses two conflicting kinds of language: the technical and the banal. (2) Jargon is any kind of language that has been overused and now substitutes for thought. (3) Neologisms, because they are invented to suit the specific needs of thinking, are the only words that aren't jargon. (4) But neologisms are precisely the kinds of words most frequently recognized as jargon… (5) Therefore all language is jargon.

With its pristine pedigree, jargon turns out to be the public intellectual's best friend–a friend whose moral power, curiously, comes from having all the right enemies. "Too stale; too new. Too foreign; too familiar. Too pedantic; too demotic. Too plain; too fancy. With all these contradictory strikes against it, clearly jargon must be doing something right," Garber writes. Jargon is everything and nothing, a sign of the certainty of ambiguity, the stability of instability, a category that defies and defeats categorization itself.

And this is Marjorie Garber's genius. Cheerfully embracing the irreconcilable, she beats her critics to the punch. What can you possibly say about a thinker who is so comfortable with intellectual incoherence, as long as it carries a whiff of subversion?