Another cog in the culture industry

June 29, 2004

Because I'm a regular reader of the print edition of The New Republic, I read this article by Anne O'Donnell (subscription-only, unfortunately) long before I found it on-line. (In fact, this post encouraged me to find the web version in order to link to it.) I mention O'Donnell's article because it alerted me to the lecture that Helen Vendler recently gave as the 2004 Jefferson Lecturer in the Humanities.

Professor Vendler's lecture is available on-line. Her topic is the humanities, in particular the humanities as they are practiced at today's colleges and universities. Professor Vendler addresses various interesting questions throughout her lecture: What are the humanities to study? What value do they possess? How are they to influence students? And so on. She answers her questions via readings of several poems by Wallace Stevens.

Since I have devoted my adult life to the study of the humanities, I was glad that Professor Vendler discussed them in a public forum. (By the way, the Jefferson Lecture is sponsored by the NEH.) Prominent academics with a public profile don't do enough to defend and promote what they do for a living. So that's all to the good.

But Professor Vendler makes several remarks about philosophy (which is my discipline) that I find puzzling. I'll quote from her first three paragraphs and then say something about them.

Here are the first two paragraphs in their entirety:

When it became useful in educational circles in the United States to group various university disciplines under the name "The Humanities," it seems to have been tacitly decided that philosophy and history would be cast as the core of this grouping, and that other forms of learning--the study of languages, literatures, religion, and the arts--would be relegated to subordinate positions. Philosophy, conceived of as embodying truth, and history, conceived of as a factual record of the past, were proposed as the principal embodiments of Western culture, and given pride of place in general education programs.

Confidence in a reliable factual record, not to speak of faith in a reliable philosophical synthesis, has undergone considerable erosion. Historical and philosophical assertions issue, it seems, from particular vantage points, and are no less contestable than the assertions of other disciplines. The day of limiting cultural education to Western culture alone is over. There are losses here, of course--losses in depth of learning, losses in coherence--but these very changes have thrown open the question of how the humanities should now be conceived, and how the study of the humanities should, in this moment, be encouraged.

After these introductory remarks, she formulates her proposal in her third paragraph:

I want to propose that the humanities should take, as their central objects of study, not the texts of historians or philosophers, but the products of aesthetic endeavor: architecture, art, dance, music, literature, theater, and so on.

I'm all for the study of the arts, which is why I repeatedly taught courses in aesthetics when I was still a professor. It's what Professor Vendler says about the status of philosophy that puzzles me so.

But I'll start with a couple of minor quibbles. First, to say that philosophy embodies truth doesn't really distinguish it from history or most of the other humanistic disciplines, since, presumably, like philosophy, they aim at producing truths rather than falsehoods. I guess, though, that she means something suitably high-minded: Truth with a capital 'T', as Richard Rorty might say.

Second, Professor Vendler's observation that historical and philosophical assertions are made from particular viewpoints is obviously true. No one has ever really argued otherwise, although many historians and philosophers have thought (rightly, if you ask me) that they could overcome their own particular particularity, if you will, in their efforts to produce knowledge claims of various sorts. The point is to work within the boundaries of our fallibility, individually or collectively, in such a way that we minimize the risk of error and maximize the possibility of arriving at the truth. All of us do this all the time.

The real problem here, though, is that Professor Vendler seems to be asserting that the particularity of the viewpoint from which a claim to knowledge issues is itself a sufficient ground for challenging the view. This is the sort of mediocre epistemology that one finds all too frequently outside of philosophy departments.

The finitude of the person making a claim to knowledge is never a sufficient ground for challenging what that person has said, because such finitude characterizes everyone who makes a claim to knowledge. A challenge has to be based on reasons that directly address what has been claimed. Airy observations of the post-modernist sort that we are all historical beings living in a particular place and time give us at best a motive to suspect what others say. By themselves, though, they aren't objections. Historical tales aren't sufficient for undermining someone else's claim or arguments. If they were, then nothing would be worthy of belief.

I'll get off my anti-postmodernist hobbyhorse (Whoa! Steady there!) and move on to what really bothers me about Professor Vendler's lecture. After all, what I've just discussed is found in her brief opening remarks, and so perhaps none of it was meant too earnestly. We might think of it as a bit of theoretical throat-clearing.

Therefore, let's look at her remarks about philosophy. Here's my question: Since when was philosophy one of the two central objects of humanistic study in our colleges and universities?

Professor Vendler wants to shift the focus from history and philosophy to "the products of aesthetic endeavor." Well, as regards philosophy, that ship sailed a long time ago. A philosophy department is typically much smaller than a history or an English department, and typically has fewer majors than either of these other two departments. There might be some schools where this is not the case, but I assure you that such places are the exception, not the rule. Just go to a good bookstore and look at the philosophy, history, and literature sections. You'll quickly see which of the three is the smallest section.

In some understandings of the humanities, yes, it's true, philosophy is accorded a central, foundational role, but the actual practice of institutions of higher education relegates philosophy to an increasingly minor role in the intellectual lives of their students (and has been doing so for a long time). The situation has become so bad in recent years that applied ethics has become the fastest-growing area within philosophy, as philosophy departments struggle to demonstrate to their deans that they're usefully contributing to the careerist ambitions that today's administrators harbor for today's students.

If what I've just described isn't the case at your institution of higher learning, then I suspect that you're studying or teaching at the University of Paris and Thomas Aquinas is on the faculty.

June 23, 2004

Niall Ferguson has become an academic celebrity of the sort that we don't see very often. His work is turning up everywhere on the web. There is so much of it that I come across it without having to search for it.

Ferguson has just written a new piece called "The End of Power" that warns the U.S. against retreating from the role of global hegemon.

June 21, 2004

I flew back to Texas on June 21, 2003. Today marks the anniversary of my returning to live in my hometown. Since anniversaries are occasions for reflection, I thought that I would post a few remarks about this past year. I'll mostly talk about getting settled after the move and getting back to work on my various projects.

The biggest difference between this year in Texas and the previous seventeen years in Philadelphia is that this year was the first one in which I was not in the classroom, either as a student or a professor. In fact, it was the first year that I had not been in the classroom since, believe it or not, I was in kindergarten. I went straight from kindergarten to public school to college to graduate school and then, finally, to teaching philosophy in the Philadelphia area. That's roughly thirty-five years in a row, without a pause or break.

It's been odd, although not as odd as it could have been. I finished a year as a visiting assistant professor at Haverford College right before I moved back, but I haven't had a job since then. Although I haven't been teaching, I've been very busy. Here are the year's highlights.

When I returned to Texas, I moved into my parents' house, which they bought thirty-three years ago. We moved in on April 7, 1971, as I recall. The house was a bit run-down, because my father had been preoccupied with my mother's multiple sclerosis for a long time. He hadn't been able to take care of the house or the yard properly, and her wheelchair had banged up walls and furniture over the years. In short, the place needed a lot of work. I asked friends to recommend a builder to me, and I settled on someone who turned out to be fantastic at his job. (A nice guy, too. Very easy to get along with.) I told him what I wanted done (and I became increasingly ambitious, since I figured that I might as well do as much as possible as quickly as possible), and he lined everything up with his subcontractors. I didn't have to do too much, except choose fixtures, appliances, carpet, and tile.

Oh, yes, I can't forget to mention the colors! I had to choose colors. I never knew that there were so many colors in the whole goddamned world. For example, when I had to choose the paint for the walls and ceilings, my builder gave me a sample block that must have contained 5000 shades of every color known to mankind. The number of shades of white alone was mind-boggling. Literally, I couldn't think. I asked myself, "Where's a queer eye when a straight guy needs one?" I soldiered on, nonetheless. I didn't spend all those years teaching aesthetics for nothing.

My most important task, though, was to stay out of the way, which I did admirably, if I do say so myself. It was easy, despite the fact that the contractors preferred to show up early in the morning. I lost a lot of sleep during the renovations.

I separated the work into two phases. The first phase, which occurred mostly in July and August 2003, but small parts of which dragged into October 2003, was devoted to renovating the interior of the house. The second phase, which occurred mostly in January and February 2004, first involved getting several new doors inside the house as well as a new garage door and then getting new windows and metal siding for the exterior of the house. By the time all the work was finished I had spent . . . well, a great deal of money. But all of it was money well spent. The house is much, much nicer to look at and to live in.

Furthermore, since I made changes of some sort to about 90 per cent of the interior, I'm not reminded as much of my parents as before. That's more important than you might realize, especially if you haven't had to live in a house filled with memories of people who are now deceased. I found the place rather oppressive when I moved back, but now, what with the renovations and an additional year to grieve for my parents, living here isn't a problem anymore, at least not on account of thoughts of my mother and father. I would have been willing to pay twice as much for the renovations if it had been necessary for making the house a more congenial place to live. Fortunately, that wasn't necessary.

Between the two phases I started to work on philosophy again. I began researching an introduction for a reprint of a translation of a book by Moses Mendelssohn. (See these earlier posts here and here for more information on this project.) I also began translating an essay by Johann Gottlieb Fichte for a book of translations and commentary that I'm doing with my friend Yolanda Estes. (She recently received tenure at Mississippi State University. Congratulations, Yolanda.) I'm about two-thirds of the way through that piece by Fichte. And, finally, I researched and began writing a chapter on Fichte for a book. I'm in the middle of the first draft as I write this.

I've been thinking a bit about two book projects, one on philosophy and horror film, the other on Max Horkheimer and Theodor Adorno's book entitled Dialectic of Enlightenment. (The latter project is another long story, but this earlier post contains some basic information. I also once gave a lecture on the first two chapters of Horkheimer and Adorno's book.) Once I finish the Fichte chapter, I'll be able to devote more time to my book ideas.

Looking back over the year I realize that I haven't been sick once, not even for a single day. I've had some mild problems with allergies, but that's just because of the change in climate. I never had such problems in Philadelphia. The sort of stress that I felt several years ago, especially in late 2000 and early 2001, is long gone. (See the relevant portions of my academic autobiography for the story of my experiences with stress.) I'm convinced, by the way, although I could never prove it, that my father became susceptible to cancer because of the many years of stress associated with taking care of my mother.

For the first time in a very long time, I've been able to sleep without having to drag myself out of bed to go to work. During my ten years as a professor I was severely sleep-deprived, since I routinely had a teaching load that exceeded that of a full-time teaching position. Furthermore, I didn't just teach the same handful of classes again and again as many professors do. Because I was an adjunct, I had to develop many new classes in order to find work. Since my academic autobiography explains all of this, I won't go into it here. None of the foregoing includes all the time spent looking in vain for a job, nor the time devoted to my scholarly writing and translating.

The only potentially restful year was my year at Haverford, but I had to deal with legal and estate business while I was teaching there, and therefore had to travel back and forth between Philadelphia and Texas quite a bit. Consequently, I didn't get much sleep then, either, although that wasn't the fault of anyone at Haverford.

Here's the irony, though. Now that I'm no longer sleep-deprived, I'm having trouble with insomnia for the first time in several years. If it's not one thing, then it's always another. The human body, or perhaps in this case the human mind, is truly perverse.

One of my former students from Haverford recently asked me a question that I wanted to take up in this post, since it essentially relates to my experiences of the past year. He asked me whether or not I ever intend to return to academia. He wanted to know how I feel about it now, seeing that I've been out of it for a year.

Don't think that I don't ask myself the very same question from time to time. Not long ago I had to consider the issue quite seriously. Here's the short version of the story.

At the beginning of March 2004, much to my surprise, I got a call from a friend of mine at a school that had interviewed me in December 2001. (The school isn't in Texas. I'll say only that much regarding its whereabouts.) That search had been canceled in early 2002 because the state had instituted a hiring freeze on account of the budget problems that many states have experienced in the past three or four years. His department had conducted a new search in late 2003 and early 2004, but they had come up empty. If I understood my friend correctly, they hadn't been able to get someone to take the position; and so they wanted to go back to the results of the aborted search from 2001-2002, that is, to the one that had involved me, among other candidates, of course.

When that search was terminated, I was one of two finalists. In the meantime the other finalist had found a job. Consequently, I was the only one left standing. My friend was more or less, kinda sorta, offering me a job, even though a lot would have had to be done before a real offer could have been made. I told him that I wasn't interested, and I thanked him for thinking of me.

I thought seriously about what my friend was saying, but I didn't have to think very long. I've moved three times in the past four years, twice within Philadelphia, once back to Texas. By the time my friend called I had spent a lot of money on my house. I was in the process of getting settled, and it was nice to live in a house of my own for the first time in my life.

Tenure at my friend's school probably would not have been much of a problem. I already have a significant publication record, one that would only grow in the next few years. The teaching load would have been heavy: four classes in the fall, and four in the spring. But I wouldn't have had to teach as many different classes as I had done in Philadelphia. And so on. It wouldn't have been a bad job, but it wouldn't have been a great one.

Since I don't need a full-time job anymore, I don't have to take whatever job might come my way. Most of all, though, I didn't want to move again, especially for a tenure-track position (which it was) that might dry up and blow away if the state had another fiscal crisis. As I said, I thought about the job, but declining my friend's preliminary offer to start the administrative ball rolling was an easy call.

I had to laugh, by the way. The closest I've ever come to a full-time position was in a year when I hadn't even applied for a job of any sort. I'm still somewhat grimly amused by the whole affair.

If someone wants to hire me as an associate professor with tenure, then I'll sell my house and move across the country. I'll even consider a tenure-track position as an assistant professor at a school like Haverford if I'm ever offered one. Until one of those offers knocks on my door, I'll stay here and work on my writing and translating.

But let's face facts. I'll turn forty-one in less than a month. Say that I finish the Fichte translations in the next two years (my deadline is in 2006) and write at least one book of my own in the next three or four years. I could then return to the job market that much more accomplished. Well, who's going to hire a forty-five-year-old white man who received his Ph.D. when he was twenty-nine, and who hasn't held an academic job of any sort for at least five years, regardless of how accomplished he may be? No one, I suspect. Hell, I couldn't get a job when I was twenty-nine and fresh out of graduate school.

Even if I could get a desirable position, would I want one? I mean, would I really want one? This returns me to the question of my former Haverford student. Recall that the academic job market progressively worsened during the economic boom years of the mid- to late-1990s. The percentage of full-time positions sank to roughly 50 per cent. There's no reason to think that things will ever get better, especially now that the country is facing increasing deficits and gargantuan Social Security and Medicare liabilities once the baby boomers begin to retire.

Many of the people in academia who have full-time jobs and are tenured are unhappy. Some of that, of course, is just whining. But when I talk with mature people with reasonable grievances, I get the impression that they're always fighting some battle to establish a decent defensive position to fight a second battle simply so that they won't be completely routed in the third battle. What sort of professional life is that?

The quality of students is not improving, funding is shrinking, and indifference to anything not blunt-headedly careerist is growing. Philosophy departments at all but the best schools are increasingly serving other departments to the exclusion of promoting philosophy as an end in itself. I miss teaching. I miss talking to students and colleagues. I miss the academic lifestyle, even as gnarled as it was for me. I don't have to get involved with the hassles anymore, and I doubt that the good things are good enough to make an academic career desirable for someone like me. Consequently, I doubt that I'll ever go back.

Maybe I'll change my mind at some point, but at the moment I don't see myself returning to academia, assuming that I might be allowed to return in the first place. Just because I show up looking for work doesn't mean that anyone will actually give me any. I might teach a class or two again, but the pay would have to make it worth my while. When I was in Philadelphia, it wasn't too hard to get work that paid decently, even though it was work as an adjunct. That won't happen in the Dallas/Ft. Worth area, unfortunately. If I stay in Texas, then it's extremely doubtful that I'll go back to teaching. If I move somewhere else, perhaps I would start teaching again. But it's too early to tell what I'll do. For the immediate future, however, I can say that I'll stay at home and not teach.

Therefore, I now assume that my teaching career is over and done with. The past year has helped me to see that I'll probably never teach again, and, more important, that that fact doesn't bother me too much anymore. It did at first, believe me. But I came to realize that I always enjoyed teaching interesting subjects to excellent students; as for the other subjects and the other students, well, sometimes I enjoyed the experience, sometimes I didn't. All academics are that way. Unlike most of them, however, I have a choice about whether or not to keep teaching. Right now I choose not to teach. I'm comfortable with that decision, which I find a bit surprising, since I never expected that I would have to make such a choice. But interesting subjects and excellent students aren't the rule. They're the exception. I've been spoiled by teaching at the University of Pennsylvania, Bryn Mawr, and Haverford. I don't know that I could teach, say, at the state university that employed my father for thirty-two years. I wonder how he managed to stay sane for so long.

So, to put an end to this post, I'm settled in my hometown in Texas and content not to be teaching any longer. I'm not happy about not teaching, but I can live without it.

P.S. Overall, by the way, I should mention that I liked Haverford best of all the schools that employed me. Haverford treated me decently, and it's a nice place with serious students. Lots of ducks, too. Students at the other places I taught were often serious, of course, but the combination of things at Haverford made it my favorite.

P.P.S. I finally chose egret as my paint color, but my walls and ceilings don't look like a bird's plumage to me. They look off-white. I can't claim to understand the difference. It's just a mystery.

June 17, 2004

Since I have only a textbook knowledge of Islam, I have to rely on other scholars and researchers for any insight into whatever connection there may be between Islam and Islamic terrorism. This article, entitled "The Religious Sources of Islamic Terrorism," appeared in the June 2004 issue of Policy Review. Despite the author's erudition, I find that I'm very disappointed with his strategic recommendations on how to fight Islamic terrorism.

The author is Shmuel Bar, an Israeli scholar. He begins by noting Western reluctance (which may be sincere or mealy-mouthed) to trace Islamic terrorism back to the tenets of Islam itself. Instead, various social and cultural grievances are usually highlighted as the cause of Islamic terrorism. (It's here that the "root cause" analysis finds its place.) Bar, however, thinks that Islam clearly has something to do with Islamic terrorism. Yet he offers the following warning at the end of his second paragraph:

A skeptic may note that many societies can put claim to similar grievances but have not given birth to religious-based ideologies that justify no-holds-barred terrorism. Nevertheless an interpretation which places the blame for terrorism on religious and cultural traits runs the risk of being branded as bigoted and Islamophobic.

I don't know about you, but whenever I come across such an admonition, I have to restrain myself from feeling bigoted and phobic by the end of the discussion. Usually, the author in question seems to make a strong case for the very bigotry and phobia that he or she is hoping to quash. But let's press forward.

Bar turns to radical Islam and jihad. The former, he says, has the following underpinning:

The underlying element in the radical Islamist worldview is ahistoric and dichotomist: Perfection lies in the ways of the Prophet and the events of his time; therefore, religious innovations, philosophical relativism, and intellectual or political pluralism are anathema. In such a worldview, there can exist only two camps, Dar al-Islam ("The House of Islam," i.e., the Muslim countries) and Dar al-Harb ("The House of War," i.e., countries ruled by any regime but Islam), which are pitted against each other until the final victory of Islam. These concepts are carried to their extreme conclusion by the radicals; however, they have deep roots in mainstream Islam.

Well, I ask myself, if the concepts that undergird radical Islam have deep roots in mainstream Islam, then doesn't that mean that mainstream Islam is bound to be largely sympathetic to radical Islam? If so, Bar has failed to reassure me; and his views only become more disturbing to my peace of mind as he unfolds them further.

The 1980s, says Bar, witnessed jihad against the Soviets in Afghanistan  a holy war to be fought not only by Muslims in Afghanistan but also by other Muslims from nearby countries. All Muslims, it seems, are obliged to prevent the reversion to non-Islamic rule of lands once ruled by Islamic law, and since there are so many lands in which this has occurred, it seems to follow that all Muslims have a duty to join the jihad.

The jihad against the Soviets, Bar claims, is where many of our current troubles were born:

The Soviet defeat in Afghanistan and the subsequent fall of the Soviet Union were perceived as an eschatological sign, adumbrating the renewal of the jihad against the infidel world at large and the apocalyptical war between Islam and heresy which will result in the rule of Islam in the world. Along with the renewal of the jihad, the Islamist Weltanschauung, which emerged from the Afghani crucible, developed a Thanatophile ideology in which death is idealized as a desired goal and not a necessary evil in war.

An offshoot of this philosophy poses a dilemma for theories of deterrence. The Islamic traditions of war allow the Muslim forces to retreat if their numerical strength is less than half that of the enemy. Other traditions go further and allow retreat only in the face of a tenfold superiority of the enemy. The reasoning is that the act of jihad is, by definition, an act of faith in Allah. By fighting a weaker or equal enemy, the Muslim is relying on his own strength and not on Allah; by entering the fray against all odds, the mujahed is proving his utter faith in Allah and will be rewarded accordingly.

Bar then discusses some of the legal issues surrounding jihad: Who is to participate? Which means are acceptable and which are forbidden? And so forth. Nothing that he writes indicates that mainstream or moderate Muslims should somehow regard jihad as alien to their understanding of Islam. Quite the opposite is true, he says:

It can be safely assumed that the great majority of Muslims in the world have no desire to join a jihad or to politicize their religion. However, it is also true that insofar as religious establishments in most of the Arabian peninsula, in Iran, and in much of Egypt and North Africa are concerned, the radical ideology does not represent a marginal and extremist perversion of Islam but rather a genuine and increasingly mainstream interpretation.

Furthermore, moderate Muslims fear being labeled apostates; consequently, they are very wary of confronting the radicals:

Moderates are reluctant to come forward and to risk being accused of apostasy. For this very reason, many Muslim regimes in the Middle East and Asia are reluctant to crack down on the religious aspects of radical Islam and satisfy themselves with dealing with the political violence alone. By way of appeasement politics, they trade tolerance of jihad elsewhere for local calm. Thus, they lose ground to radicals in their societies.

If local governments aren't especially willing to combat the radicals who engage in terrorism, then what can be done? Bar says that we need a comprehensive strategy that remains true to our democratic values. He asks whether or not a strategy addressing the ideological roots of radical Islam is a possible one. He doesn't really answer his own question, unfortunately. Here is what his strategy would look like, leaving aside for a moment the issue of its likely success:

First, such a strategy must be based on an acceptance of the fact that for the first time since the Crusades, Western civilization finds itself involved in a religious war; the conflict has been defined by the attacking side as such with the eschatological goal of the destruction of Western civilization. The goal of the West cannot be defense alone or military offense or democratization of the Middle East as a panacea. It must include a religious-ideological dimension: active pressure for religious reform in the Muslim world and pressure on the orthodox Islamic establishment in the West and the Middle East not only to disengage itself clearly from any justification of violence, but also to pit itself against the radical camp in a clear demarcation of boundaries.

Such disengagement cannot be accomplished by Western-style declarations of condemnation. It must include clear and binding legal rulings by religious authorities which contradict the axioms of the radical worldview and virtually "excommunicate" the radicals. In essence, the radical narrative, which promises paradise to those who perpetrate acts of terrorism, must be met by an equally legitimate religious force which guarantees hellfire for the same acts.

He then fills out the details with six bullet points that I won't quote. You'll find them towards the end of the article.

Overall, I find Bar's recommendations very disappointing. He realizes that he is calling for an "Islamic Kulturkampf" (his phrase, found in his penultimate paragraph). But given what he has written earlier, can we reasonably expect one to take place? If we really are in a religious war (once again, his phrase, which I quoted above), is it rational to premise part of our strategy on reform within the Islamic world, when we have no good reason to believe that moderates will challenge radicals in the manner required by Bar's strategy? Perhaps they are willing to confront terrorists in some fashion to save themselves. But will they do so on our behalf? Bar doesn't answer this question.

Are moderates (and, of course, they exist) really willing to act in a fashion that in effect makes them our allies and the radicals (who are their co-religionists) their enemies? Perhaps they are, but why should we believe that they are so willing? What evidence of their willingness has Bar provided? In short, has Bar given us a viable strategy? The policy that he advocates seems nothing but fanciful.

As the Bush administration has learned during the past year, hope is not a plan. Bar is from Israel, as I noted at the beginning of this post. Could Israel have had any success with the strategy that he proposes? Not as far as I can tell. Look at how easy it has been to radicalize the Palestinian people. Perhaps the West might more readily pursue the strategy that Bar recommends, but his article isn't at all encouraging on this point. Instead, if Bar is right in his diagnosis of the religious roots of Islamic terrorism, then his prognosis ought to be commensurately gloomy. That is, we may be in a religious war without an end in the foreseeable future.

June 15, 2004

This column by David Brooks draws part of its argument from a book by John B. Judis and Ruy Teixeira entitled The Emerging Democratic Majority. Brooks argues, in effect, that rivalries within the educated class, which he sums up as the conflict between professionals and managers, are responsible for much of the political polarization in the U.S. Much of the notion of professionals that Brooks employs, along with the claims about their voting habits, comes from Judis and Teixeira's book.

I highly recommend The Emerging Democratic Majority, since it's a serious study of recent political history and a hopeful work of prognostication for Democrats. (I have a yellow dog around here somewhere. Come here, boy!) The basic idea is that various demographic and cultural trends favor the Democrats, hence the book's title. You can be sure that Karl Rove has read the book. (Get the paperback edition. It contains additional material not found in the original hardback edition.)

Ruy Teixeira has a blog entitled The Emerging Democratic Majority WebLog. It's very useful for understanding polling data, among other things. In the right-hand column you'll see some links to his writings.

P.S. This article from Slate discusses some of the recent liberal criticisms of Brooks' work.

I was wondering when I would come across an article that would draw attention to an obvious fact about all of the mourning for Ronald Reagan:

If a politician died and no white people wept, would he be painted an American hero?

If he drew throngs of mourners made up almost exclusively of minorities, would the mainstream news media insist that he was universally loved by all Americans?

Would the networks subject their viewers to the 24/7 non-stop hero worship being accorded Ronald Reagan, if all but a tiny handful of the thousands and thousands of Americans paying tribute were brown instead of white?

I think not.

In its eagerness to characterize Ronald Reagan as an American icon based upon the "outpouring" of grief at his passing, the media are ignoring an important yet unavoidable fact: there are hardly any people of color singing Reagan's praises. For days, we've watched eager commentators tell us ad nauseam that the mourners waiting to view Reagan's casket represent a "cross-section of America." But if they just turned around, they'd see what we see -- a virtually all white tableau snaking behind them.

Why such negativity? Lots of reasons, as far as S. J. Jones, the author of this article, is concerned. First of all, she hasn't forgotten how Reagan began his campaign in 1980:

To many of us, Ronald Reagan was not a great man. He was not a hero. In fact, many of us have nothing but painful memories of his presidency and what he stood for.

This pain was first inflicted like a punch in the stomach back in 1980 when Ronald Reagan journeyed to Philadelphia, Mississippi, the site of the murder of three civil rights workers 16 years before, to kick off his general election campaign with a speech endorsing states' rights.

"I believe in states' rights," he said on that August day. "I believe we have distorted the balance of our government today by giving powers that were never intended to be given in the Constitution to that federal establishment." He went on to promise to "restore to states and local governments the power that properly belongs to them."

Black Americans -- and those hostile to our interests -- knew exactly what he meant. "States' rights" has long been a code word for segregation, discrimination and massive resistance to the federal government's efforts to stop southern states from oppressing blacks. States' rights was the excuse given for denying blacks the right to vote, access to public accommodations, equal protection of the laws. It was such a deeply held principle among some Southerners that they even lynched folks who interfered with it.

And Ronald Reagan turned up in the place where James Chaney, Michael Schwerner and Andrew Goodman were murdered for registering black voters, to affirm his commitment to the principle that led to that and countless other terrorist acts and to promise that he would fight to turn back the clock to the days when local governments had the power to treat minorities any way they damned well pleased. And the 30,000 white folks who crammed the Neshoba County Fairgrounds that day cheered him wildly and then helped send him straight to the White House.

That final paragraph overstates what Reagan personally intended, I think, but he was perfectly happy to gain support among white Southerners by letting them think that he was more like them than he really was. The rest of the article is a useful reminder of how a large portion of the country sees Reagan to this day.

June 14, 2004

Two former students, both of whom studied Max Horkheimer & Theodor Adorno's Dialectic of Enlightenment with me, have emailed me about this story in The New York Times. It's about a new Canadian documentary called The Corporation. I should say that I haven't seen the film.

The article makes interesting use of Horkheimer & Adorno's book. Here's the first paragraph:

In their 1944 work, ''Dialectic of Enlightenment,'' Max Horkheimer and Theodor Adorno advanced a theory on the far-reaching power of what they called ''the culture industry.'' This entity, encompassing all forms of mass culture, media and the businesses behind them, made up such a totalizing system that it was literally impossible to rebel against it. This complex not only anticipated the urge to revolt but would sell you something to satisfy it. (Che Guevara T-shirt, anyone?) It's a resoundingly depressing theory but an interesting one to recall, because anticorporate sentiment is lately prominent in pop culture.

The film is said to be the latest example of this anticorporate sentiment. It might be worthwhile, but I haven't yet heard from anyone who has seen it. If it comes to Dallas, then I could see it. My hometown is too small for the local theaters to feature a documentary.

Towards the end of the article there is some fulminating about the possibility of dissent within the confines of the culture industry. As far as I'm concerned, this wasn't one of Horkheimer and Adorno's major concerns. They were more worried about the transformation of autonomous art (as they labeled it) into a form of mere entertainment that is simply an extension of the working day. (That is, to fudge the theory a great deal, we amuse ourselves in the evening to rest up for the next day's labor.) Naturally, that change would have an effect on what people want to read and watch, and, presumably, the political content would usually be low or non-existent.

But since Horkheimer and Adorno never saw the primary purpose of autonomous art in a straightforwardly political light, the absence of dissent in the products of the culture industry isn't their main concern. Many of the greatest works of art have no obvious political content or ambition. Horkheimer and Adorno were well aware of that.

Anyway, I was glad to see discussion of their book turn up in a piece in The New York Times. I'm currently talking with an editor about writing a monograph on Dialectic of Enlightenment. It's one of my favorite works in 20th century continental philosophy, and no one has ever tried to comment on the book from start to finish. I taught it numerous times over the years, and now I'd like to write something about it.

If you haven't seen Preston Sturges' Sullivan's Travels, then go out and find it. Once you see it, you'll understand why I brought it up in this paragraph, given the overall context of this post.

Acknowledgment: Many thanks to Jonathan Church and Ross Lerner for alerting me to the article in The New York Times.

June 13, 2004

In an earlier post I briefly touched on the issue of anti-Semitism among German philosophers. My basic position is that the anti-Semitism, while it was real, was not embodied in their philosophies. Such a claim requires elaboration and defense, of course. But that's how many philosophers, myself included, view the matter. Scholars from disciplines outside of philosophy are sometimes tempted to see things differently.

I alluded in that earlier post to a notorious passage from Fichte, but I didn't quote it. There's no free-standing translation of the book in question, but his remarks about Jews (a few paragraphs long) are translated and discussed in various places. Besides chapter 8 of Paul Lawrence Rose's Revolutionary Antisemitism in Germany from Kant to Wagner, I'd also recommend chapter 5 of Anthony J. La Vopa's Fichte: The Self and the Calling of Philosophy, 1762-1799 (Cambridge University Press, 2001).

In this post I want to quote a passage from Kant, who is a much more significant thinker than Fichte (as even Fichte scholars would acknowledge), and thus an anti-Semitic observation from him is worth more attention. The remark comes from Anthropology from a Pragmatic Point of View, a manual for a lecture course that Kant gave for nearly thirty years. It was first published as a book in 1798. The passage appears in §46, which is entitled "On Mental Deficiencies in the Cognitive Power" ["Von den Gemüthsschwächen im Erkenntnißvermögen"]:

The Palestinians living among us have, for the most part, earned a not unfounded reputation for being cheaters, because of their spirit of usury since their exile. Certainly, it seems strange to conceive of a nation of cheaters; but it is just as odd to think of a nation of merchants, the great majority of whom, bound by an ancient superstition that is recognized by the State they live in, seek no civil dignity and try to make up for this loss by the advantage of duping the people among whom they find refuge, and even one another. The situation could not be otherwise, given a whole nation of merchants, as non-productive members of society (for example, the Jews in Poland). So their constitution, which is sanctioned by ancient precepts and even by the people among whom they live (since we have certain sacred writings in common with them), cannot consistently be abolished -- even though the supreme principle of their morality in trading with us is "Let the buyer beware." I shall not engage in the futile undertaking of lecturing to these people, in terms of morality, about cheating and honesty. Instead, I shall present my conjectures about the origin of this peculiar constitution (the constitution, namely, of a nation of merchants). [Quoted in Immanuel Kant, Anthropology from a Pragmatic Point of View, tr. Mary J. Gregor (The Hague: Martinus Nijhoff, 1974), p. 77.]

Kant then offers his conjectures about the origin of the Jews as a nation of merchants -- Palestine, as he puts it, was well-situated for the caravan trade, and so on.

What to say? Well, first, I have to admit to some uncertainty as to how the word Verfassung, which Gregor has translated as "constitution," is to be understood. Frequently, the German term has a political connotation: that is, a Verfassung can be a constitution, a political document of some sort. But that doesn't seem to be the proper way to understand the term as it is used in this passage. (I'm open, though, to suggestions as to how to read the word in a political sense.)

Instead, Kant seems to mean something like "state of mind" or "mental condition," both of which can plausibly translate Verfassung and make more sense in the context of the overall discussion, which, as the section title indicates, is devoted to the question of mental deficiencies. In the case of the Jews, Kant is attributing to them the mental deficiency of being habitually dishonest. (Don't forget, by the way, his "for the most part" qualifier, which seems to apply to the number of Jews who are cheaters. That is, Kant doesn't seem to be saying that all Jews are dishonest.) I'd say that he's referring to what he takes to be their mental constitution, which, in the context under discussion, amounts to the grievous moral failing of dishonesty, especially in commerce. (I skimmed through the entire text of Kant's Anthropology, but I didn't find any other places where he used the word Verfassung that might help us here. I admit, though, that I could have missed something. I'm blogging here, after all, not writing a scholarly paper.)

Furthermore, he seems to postulate the perpetual existence of this deficiency as long as Jews (1) remain a nation of merchants who (2) reside in non-Jewish countries, (3) make up for their lack of civil dignity (i.e., their second-class citizenship) through dishonest business practices, and (4) abide by ancient religious precepts that sanction their behavior. (Kant doesn't seem to say that their precepts are the cause of the dishonesty that he attributes to the majority of them.) That's why he says that their constitution -- or however else we might translate Verfassung -- cannot consistently be abolished.

I don't know why Kant wrote this particular remark. No one does, as far as I know. I spend little time condemning the failings, moral or otherwise, of people who are long dead. Such an activity I consider a form of self-righteousness. Kant's remark speaks for itself, and nowadays we know what it says. Let's leave it at that.

The important intellectual consideration is whether or not the remark  which expresses an attitude towards Jews that Kant held at some point in his life  somehow informs his philosophical writings. Since Kant gave his anthropology course many times, we don't know when he penned this remark, nor whether he repeated it every time he gave the course, and thus we don't know for sure whether or not he believed it until the end of his life. But let's say, for the sake of argument, that Kant wholeheartedly believed, throughout his entire life, that the Jews were a nation of cheaters. Does it matter for understanding his philosophy?

Kant's moral philosophy is clearly meant to be universal in scope and application. Therefore, it assigns the same duties and rights to all human beings. Furthermore, according to Kant, the failings of others do not excuse us from our obligations towards them, however sorely they might test our patience. (Two wrongs don't make a right, as the saying goes.) Kant says that he refuses to lecture the Jews about their failings, because he thinks that to do so would be futile. But he does not say that their alleged failings excuse us from our obligations towards them.

Since the observation in the quoted passage is an empirical falsehood, it isn't a consequence of Kant's philosophical views, which are never to be mistaken for empirical generalizations of any sort. That is, what Kant says about the Jews can't be a product of his philosophy. A general condemnation of dishonesty  applied to Jews and non-Jews alike  is to be expected from his moral philosophy, but the sweeping generalization about Jews isn't a philosophical view. It's just an empirical falsehood that expresses a prejudice.

There's nothing in this passage that should prompt any reflection about the nature of Kant's philosophy. For some reason, which no one has been able to fathom, Kant subscribed to one of the ancient stereotypes of the Jewish people. From a personal point of view, clearly, it's lamentable; from a philosophical point of view, however, it's irrelevant.

It's official: Though the economy is clearly expanding and jobs are coming back, the benefits of growth are once again accruing to the wealthy. After a brief hiatus during the late 1990s, economic inequality is reasserting itself.

No less than the nation's chief economist, Federal Reserve Chairman Alan Greenspan, noted this in recent testimony, stating that "most of the recent increases in productivity have been reflected in a sharp rise in the pretax profits . . . " This trend stands in sharp contrast to the way growth was apportioned just a few years ago, when the benefits of workers' increased efficiency were broadly shared.

The difference between then and now -- and the key determinant as to how the benefits of growth are distributed -- is the tautness of the labor market.

But without low unemployment most workers aren't seeing much in the way of wage gains. Why? They lack the bargaining power to press for them:

This combination of strong productivity growth and weak labor markets translates into wage stagnation for most, along with increased inequality. Full-time workers' weekly earnings, adjusted for inflation, show a widening gap between the highest and lowest wages. For workers below the 75th percentile -- those earning less than the top 25 percent are earning -- real earnings grew by less than one percent. Only those at the top of the wage scale have benefited from the economic recovery, as real earnings at the 90th percentile grew 2.5 percent for men and 4.5 percent for women. These findings suggest that at least three-quarters of adult, full-time workers currently lack the bargaining power to press for a fair slice of the expanding pie. They are contributing impressively to this economy, but it is not returning the favor.

An earlier article from The American Prospect -- this time the author is Lawrence Mishel -- included some eye-opening figures about income inequality:

The top 1 percent of families earned 9.3 percent of all income in 1980. By 2000, this income share had increased to 19.6 percent. Correspondingly, the income share of the bottom 90 percent declined from 66 percent to 53.9 percent. There were small gains (1.9 percentage points) in the income shares of the remaining group, the 90th to 99th percentiles.

From 1980 to 2000, the incomes of the upper 1 percent increased 179 percent, while those of the bottom 90 percent increased by 8 percent.

In 1970, the ratio of top executive earnings to that of the average worker was 38.6 to 1. This ratio increased to 101.1 by 1980, to 222 by 1990, and to 1046 in 1999.
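Mishel gives the income shares only for the top 1 percent and the bottom 90 percent; the share of the remaining group, the 90th to 99th percentiles, is whatever is left over if the three groups partition all income. As a quick back-of-the-envelope check (a sketch under that assumption, using only the numbers quoted above):

```python
# Income shares quoted from Mishel's figures (percent of all income).
top1 = {1980: 9.3, 2000: 19.6}       # top 1 percent of families
bottom90 = {1980: 66.0, 2000: 53.9}  # bottom 90 percent of families

# The 90th-99th percentile share is the remainder, assuming the
# three groups together account for all income.
middle = {y: round(100.0 - top1[y] - bottom90[y], 1) for y in (1980, 2000)}

for year in (1980, 2000):
    print(f"{year}: top 1% = {top1[year]}%, "
          f"90th-99th = {middle[year]}%, bottom 90% = {bottom90[year]}%")

# Implied gain for the 90th-99th percentile group, in percentage points.
gain = round(middle[2000] - middle[1980], 1)
print(f"Implied 90th-99th percentile gain: {gain} percentage points")
```

This yields an implied share of 24.7 percent in 1980 and 26.5 percent in 2000, a gain of 1.8 percentage points, which agrees with the article's figure of 1.9 points up to rounding in the quoted shares.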

I've seen such figures before, but they always manage to take me by surprise whenever I see them again. Matt Miller's idea of adding a feature called "Still True Today" to national newspapers seems like a step in the right direction for combating the absence of such genuinely vital material from our political conversations.