Thursday, May 26, 2005

...well, not exactly 40 years ago today. But it was 40 years ago, in 1965, that the world of pop music changed--you might say, turned a corner, or came into its own.

Sometime last week, I popped The Beatles' Rubber Soul into the CD player at the office. How many times have I listened to that record? Twenty, fifty, a hundred? Anyway, it was in the background: "Drive My Car," then "Norwegian Wood." And then "You Won't See Me"--just another Beatles love song, one that has never, to my knowledge, received much analysis or praise. Yet, somehow, it commanded my attention, right from the opening lines:

When I call you up
Your line's engaged
I have had enough
So act your age

And I thought: what a perfectly wonderful lyric. What a perfectly wonderful melody. What a perfectly wonderful little song. And that's really the whole story of pop music, isn't it? The catch, the hook, the chord change and turn of phrase, that somehow makes the whole thing a work of art--disposable art, perhaps, something profoundly synthetic, repeatable, contained. But art nonetheless.

I sometimes play around with friends about what "pop music" means. If it has any meaning besides that which record companies want to give it for marketing purposes (and maybe it doesn't), then that meaning probably resides in the idea that there are certain artistic conventions that are "popular"--not in the sense of necessarily appealing to the whole population (because what could ever do that?), but in the sense of being accessible to mass dissemination and consumption. Big, sprawling, demanding works of music and art, ones that depend upon innumerable circumstances and variables, can't ever be "popular" in this sense, though certainly some can have a lot more fans and attract a lot more attention than others. (Think opera, for example: there's no opera that can or ever could be described as "pop music," though clearly La Bohème is a lot more popular than, say, Elektra.) Real pop music only emerged once the sort of technology (both human and mechanical) had been developed that allowed musicians and other artists to produce work that could be performed and reproduced in a reliable way. That doesn't mean pop music was a completely modern phenomenon; you could, perhaps, argue that a lot of chamber music was "pop music," particularly in the hands of a master like Mozart (think of Eine kleine Nachtmusik, or other such pieces: Mozart, in writing such music, didn't really innovate, but simply wrote the same way everybody else was writing...only he did it, within those limited conventions, better than anybody else at the time could). But broadly speaking, it was only with the 20th century--after Gilbert and Sullivan, after the rise of Tin Pan Alley, with the radio revolution--that we really began to see "pop music" as a separate musical form.
The four-minute song (verse, chorus, verse, chorus, bridge, chorus...), the back-up band, the singer and songwriter: all settled into a well-established groove, and its easy dominance shaped (to the fury of many musicologists and preservationists ever since) much of whatever country, bluegrass, blues and folk music managed to make it into the recording studio in the decades which followed. (Classical music went its own way, while jazz, for a while at least, managed to be an occasional exception.) The majority of the music made by both Frank Sinatra and Elvis Presley shared a similar "pop" architecture, however radically different their sounds.

And then...The Beatles. I like how Colby Cosh put it in this column a while back:

There are a lot of things people haven't quite absorbed about the Beatles. I remember being dumbstruck by a passage in Ian MacDonald's Revolution in the Head, which, incidentally, has no credible rival as the best book ever written about rock music. MacDonald paints us a scene of the Beatles' earliest days as celebrities, after they'd just migrated to London. We meet two promising young performers whose blues-influenced band has just been signed to a contract: a Mr. Mick Jagger and a Mr. Keith Richards. They've swung by the studio to exchange pleasantries and watch the northern quartet at work. They watch as McCartney and Lennon, equipped with a chorus and a verse they've worked on earlier, try to stitch things together with a middle eight, messing around on a piano. In maybe fifteen minutes of banter and tinkling, they've knitted a few chords into a future hit. This is all completely foreign to Mick and Keef, and as they watch, the penny drops. They realize that a person who can play an instrument can write a song. There's no magic to it--no massive leather-bound manual locked in a Brill Building safe that tells you how to do it. It hadn't occurred to them that there could be such an animal as a rock-and-roll group that wrote its own songs.

Of course, it's not as straightforward as all that. Lots of innovations were always rippling through the pop music ocean; there were always tensions and critical standards and expectations being overturned. Even in terms of rock-and-roll, this story is incomplete; as Colby notes, MacDonald forgets that Buddy Holly was breaking all the rules that the Beatles broke back when they were still all individually listening to skiffle music.
But still, there it is--sometime in the middle of the 1960s, the whole pop music world was re-oriented. Pop music became the province of those who were themselves caught up in extant musical currents, rather than those who were standing somewhere to the side of them; pop music was released from its "industrialized vial," in Colby's words.

When was the big moment? Why not make it 1965? In 1965 the Beatles kicked off their U.S. tour playing Shea Stadium, the biggest (and loudest) rock concert ever up until that point, a concert that the Beatles later looked back on as a turning point: within a year the touring was all over, and they retreated to the studio, never to tour again (though Paul wanted to). In 1965 they released Help!, the last Beatles album to feature a song they didn't write themselves, with a couple of sublime Lennon/McCartney gems hinting at what was to come: "Ticket to Ride" and "You've Got to Hide Your Love Away." And then of course in 1965 they released Rubber Soul, with--besides what I've already mentioned--"Nowhere Man," "Drive My Car," "Michelle," and "In My Life." After that year, perhaps, one could really say that the Beatles had changed the world. Certainly they'd changed almost everything about what was taken for granted about pop music--about the relation between singer and songwriter, about the role of the studio and recording technology, about what an "authentic" performance consists of, about what you can get away with on the radio--and changed it for good. They didn't write and record the most important or powerful pop music of the 1960s, but they did write and record just about all the best pop songs, and that's enough.

Once, after VH1 or somebody had come out with another one of their ridiculous "Top 100 Albums of All Time" lists, I got into one of those predictable arguments with some friends, about what Beatles albums should go where. I ended up suggesting that every such top 100 list should simply leave about five of the top ten slots open, with the words "insert the post-1964 Beatles album of your choice here." That would just about do it. They may not be your favorite, and you may not care for any of their songs, but you've been influenced by them nonetheless. I doubt anyone who has listened to or made any pop music anywhere in Europe or America (or elsewhere!) anytime in the last 40 years can plausibly claim otherwise.

Wednesday, May 25, 2005

My post on Ricoeur has inspired an exchange between myself and an old friend, Damon Linker. It's an exchange which arises from both a disagreement over certain elements of the history of philosophy, and a disagreement over what it means to make an affirmation, particularly a religious affirmation. In other words, it's about truth and how we moderns--as opposed to those who came before us--recognize and affirm it, and whether Ricoeurian hermeneutics provides (or at least points toward) an answer to this problem or merely distracts us from the heart of the puzzle. I'm going to excerpt one small part of Damon's valuable criticisms, and try to build from that a response that I hope will elaborate a little more clearly some of what I posted yesterday.

Damon wrote:

What I detect in your hermeneutical tradition is the longing to recapture that "naive" (Ricoeur's phrase, remember) experience of pre-reflective/pre-critical experience on the far side of Enlightenment skepticism, which it holds to be purely "negative" in the sense that it negates this pre-critical experience by showing its inevitable dependence upon active subjectivity, which produces epistemological relativism and psychological despair or melancholy. Only the "negation of the negation" can lead self-conscious men and women to overcome this impasse and achieve a reconciliation -- i.e., allow someone . . . to be both intensely skeptical and critical while ALSO affirming something analogous to the pre-reflective experience of pure faith.

In replying to Damon, I find myself trying to get clear on what hermeneutic thinkers like Ricoeur (and myself) are actually claiming when we talk about "naivete." Many of the usual histories of hermeneutics talk about it as a product of romanticism or the Counter-Enlightenment; of thinkers like Dilthey, Schleiermacher and others who were working through what Kantian Idealism, and its proposed resolution to the problems of modernity, meant for the study of the social or "human" sciences (Geisteswissenschaften). Short history of modern thought made even shorter: Descartes, Locke and others gave us a model of knowing which rejected the presumptions of medieval Christendom, insisting that naming a thing was a matter of mentally picturing it, not relating to it in reference to a divine, encompassing world or telos; subsequent modern thinkers--like Hume--pointed out that this kind of empirical picturing, methodologically powerful as it may be, implies that the subject and the object are not just profoundly but categorically separated, leaving us without any basis for "knowledge" except our own sentimental habits and preferences; Kant responded to this critique by taking up the idea of categories and positing the subject as an entity wholly and solely concerned with the phenomenal world (the world of appearances, in other words), with morality and all other normative claims being a function of a practical reflection upon the categorical limits of knowability itself: this is the Idealist or "critical" turn in philosophy. Hermeneutics supposedly comes to the fore at this point; with Kant definitively separating dogmatic metaphysical claims from critical claims of knowledge, all that remains is to work out how we interpret the phenomena which appear before us. So far, so good.
In a world where the subject is understood to be distinct from the objects it seeks to know, then all that is left for the subject to do is to try to figure out the best way to clarify and interpret its own (limited and inward) pictures of those objects, and that makes the work of descriptive clarification and interpretation--of texts, in particular--paramount.

But after Nietzsche, it became very difficult for all these interpreters (working as they were in the wake of both Kant and Descartes) to explain how matters of power and perspective weren't structured into their every description of a given text or appearance from the get-go. Early phenomenologists like Husserl hoped to get around this problem by rethinking the emphasis of the critical tradition entirely: knowability (and hence the "moral law within" which Kant held could be constructed on the basis of his response to the subject/object dichotomy) ought to be understood, they argued, as a function of "the things themselves," meaning that the focus of philosophical study ought not to be our consciousness of appearances, but how that thing which appears impacts upon our consciousness. Then came Heidegger, who exploded this whole approach by pointing out that things only "appear" against a field of some kind--and what followed, of course, was the whole revival of ontology, the questioning of the "metaphysics of knowledge" which had been (unconsciously?) buried beneath the Kantian enterprise from the start, the "call of Being," and all the rest. Which resulted in an interesting transformation of hermeneutics, one carried out by such people as Ricoeur (who had spent his years in a WWII German POW camp reading German philosophy, believe it or not): hermeneutics, which was presumed to be a kind of romantic method, was turned against method itself (insofar as "method" meant addressing the subject/object distinction, or the fact/value distinction if you prefer, in an essentially empirical way). This "new" philosophical hermeneutics presented itself as actually dependent upon a much older understanding of subjectivity, one which has deep roots in Christian and Western thought, one which (arguably) certain early Romantics (like J.G.
Herder) turned to not because it provided a theory of understanding which accommodated the importance of subjective interpretation in making knowledge and truth claims, but because it echoed older understandings of truth, wherein knowledge was held to be something uncovered or illuminated by and through the subject matter itself. (This is plain when you look into the pneumatological roots of Geisteswissenschaften itself.) In other words, it involved (or at least should have been understood to involve) "believing in order to understand"--a concept which goes back to St. Paul, if not earlier.

What does this have to do with naivete? What it means is that, from Ricoeur's point of view, what is being sought through his kind of philosophical or "anthropological" hermeneutics, through a "wagering" of belief upon the actual (historical, natural, linguistic) conditions or "field" or "horizon" of our interpretive work in the world, is not a return to wholly precritical, prereflective belief. According to this argument, Augustine and all the rest were fully aware of their own subjectivity; the fact that you can find evidence of this awareness all through the medieval period, in the context of debates over the status of knowledge (nominalism, negative theology, etc.), is just additional support for the argument. Hence, the "naivete" which Ricoeur spoke of wasn't a belief that existed without interpretation--after all, the symbols which ancient and medieval believers took up as meaningfully related to the world nonetheless had to be connected to that world through an act of evaluation, though an evaluation which included the person as well as the symbol itself. So Ricoeur is not imagining that there is some way to overcome the subject entirely; the notion that anyone ever did hold to such nonreflective beliefs is a straw man. Thus I think that when Damon doubts the ability of Ricoeurian hermeneutics to get one "back to" the "experience of pure faith," he's imagining faith in terms of Cartesian presumptions, in which subjective reflection is of course completely different from cognitive correspondence, and that's a faith that I sincerely doubt any serious person has ever entertained.

Some have argued that hermeneutic thinkers like Ricoeur, Gadamer, Taylor and others reject the correspondence theory of truth; I think this is incorrect. What their hermeneutical/philosophical anthropology does do is reject those modern presumptions about the mode of correspondence which drove Kant to conclude that knowledge and morality couldn't draw any force or support from the noumenal, "actual" world. That is, what I see them as doing is rejecting a strictly scientific, "mediational" or representational epistemology, in all of its forms. Under this hermeneutic model, "truth" does not follow from the correspondence of a mental picture (or formula, or methodologically obtained data point) to the appearance of a thing, since it is necessary to also concomitantly evaluate one's relationship to the space of the thing's appearance. But that evaluative relationship is a matter of correspondence--to tradition, to intuition, to prior beliefs which themselves were constructed in the context of a received space or horizon of recognition, and so forth. How this evaluation works, and whether or not it issues in defensible truth claims, is at the heart of the matter. But whatever else it may be, it is not a claim that we can somehow dispense with a subject-that-corresponds entirely, since there is no believing which isn't at least partly a function of an understanding which assumes some sort of interpretive correspondence.

(Perhaps the misunderstanding here is the fault of Hegel. From within Hegel's philosophy, earlier notions of subjectivity and interpretation--that is, the kind of notions which philosophical hermeneutics emphasizes--truly were marginal, but that is because they (again, from Hegel's point of view) put the cart before the horse: they assumed the priority and presence of the world over and through one's perspective of it. Whereas Hegel insisted that the only solution to the dilemma of modernity and the division of the subjective self from the objective world it gave birth to was (in contrast to Kant's solution of delineating and attaching great importance to the specifics of those resulting divisions) to discover (or impose) a grand unitary spiritual logic within the subjective perspective itself. Whatever historically turned out to be rational would be real, the real data, itself. A brilliant philosophical creation, maybe the greatest in history, but still one which confronts the terms which Descartes set. And the hermeneutic tradition (which Hegel liberally borrowed from, but otherwise had little interest in) makes the claim that it is the Cartesian analysis of subjectivity that is flawed, not the responses to it. So hermeneutics isn't, I think, trying to get away from a "common sense" understanding of truth, since supposedly for moderns the only way to rescue such sensibility today is to go the route of Hegelian absolutism; rather, it's trying to think through again what modernity defined as the requirements of common sense knowledge in the first place.)

If all this is true, why does Ricoeur talk about a "second naivete," one which is only possible through a kind of "hermeneutics of faith" (as opposed to a "hermeneutics of suspicion")? Well, because while the subject has always been with us, modernity has changed our accounting of subjectivity; whereas in premodern times the "subject" which participated in the interpretation of truth was broader (both spatially and temporally) than a single person, modernity has made possible a thinking about ourselves as apart from the field of evaluation, as independently capable evaluators. And so the old naivete--which Isaiah Berlin, borrowing from Friedrich Schiller, once called a lack of consciousness about the "rifts in one's milieu"--won't do any longer. It's the difference between someone who has only ever been immersed in a single musical tradition--because their identity is coextensive with that tradition--making distinctions between good and bad musical expressions of one's tradition, and someone who has been introduced to a plurality of musical traditions, and now must make such distinctions with an understanding that the evaluative criteria provided by their own believed tradition itself can also be evaluated. This is why identity and recognition have come to play such a huge role in the thinking of these philosophers; figuring out how to account for and defend the existence of the interpretive self in the context of what they hold to be manifest demands for belief is no easy task. So now we have to "wager" on interpretation; we have to use it self-consciously and therefore critically, whereas before the presence of the individual subject wasn't crucial to how subjective evaluations were performed.

Clearly, this doesn't satisfy a lot of people. On the one hand, you have people like Rorty and others who simply think that the efforts of hermeneutical thinkers to preserve any difference whatsoever between the content of a thing and our evaluation of it are pointless; there's just the subjective will, they say, and that's all. On the other hand, there are people like my friend Damon (unless I am profoundly misunderstanding him, which is always possible--that's the burden of interpretation, after all), who continue to suspect that all this talk about intuition and illumination and tradition in the context of religious belief is a dodge; that it's a way to play at "meaningfulness" when one can no longer really ascribe normative force to claims that can be contested by the fact of pluralism. And this is where you can see a political point to this highly philosophical debate. The most common "political" accusation lobbed at hermeneutic thinkers like Ricoeur (as Scott McLemee humorously implied yesterday) is that they are "conservative" in denial: that their philosophy amounts to a reactive attachment to religion, tradition, and the status quo, but without any open or straightforward endorsement of such, thereby undermining the possibility of critical thought and progress by default. The most common accusation lobbed against communitarian thinkers (by both liberals and conservatives) is that they are engaged in a kind of cheap nostalgia: that they want to invest society with the sort of meaningfulness that religion and tradition can provide, but without actually embracing the (presumably reactionary) truth claims which would make meaningfulness possible. There is no necessary connection between these two sets of thinkers, except this: both want to affirm the possibility of finding in our present, supposedly "self-interpreted" moment, a continuation of collective meanings that have a hold on our claims and our actions.
Such a possibility would imply both that 1) the ethical horizon of any politics is a function of a polity's "meaning-full" historical identity; and 2) the meaning of that historical identity of a polity is not static, but is instead constituted by a continuum of interpretive links in an always-manifest chain. So, yes, as the democratic theorist Sheldon Wolin once put it, modern politics is characterized by "the presence of the past" (and, as Richard Bernstein once suggested, there's much which Wolin and other radical thinkers share with supposedly conservative hermeneutic philosophers: specifically, a rejection of an epistemologically mandated restriction in what counts as meaningful political action). The point is to vivify and draw from that past; not set it aside as something which cannot provide conclusions which fit modernity's peculiarly "demonstrable" criteria for what works, for what is true.

Happily nihilistic pragmatists/liberals like Rorty say not to worry about contingency. More admirably, modern-day Kantian liberals like Rawls and Habermas try to justify a politics of strict criteria and method in the face of post-Heideggerian doubts. Other liberals reject both options, but continue to contend that the modern fact of pluralism constitutes a crucial obstacle to the individual affirmation of collective religious truth--an obstacle which I think they also hope can be taken up as a bulwark against religious conservatives who are often themselves simple-mindedly Cartesian about their own political aims. Philosophical hermeneutics, with its hopeful politics of intuition and affirmation, does risk becoming an entryway for such conservatives into the liberal order; that is certainly true. But I don't think avoiding such a "naive" wager is worth it if it means not looking to see if, beneath all the method, there isn't a history to our polity and our selves that we can believe in.

Monday, May 23, 2005

Via the invaluable Political Theory Daily Review comes the news that Paul Ricoeur has passed away. (Though, now that I scan the web, it's clear that Nathaniel Robinson and Anthony Paul Smith caught this news fresh over the weekend.) He was 92, a wonderfully advanced age, though still 10 years shy of that reached by the philosopher of hermeneutics so often linked with Ricoeur, Hans-Georg Gadamer. Is there something to be learned from the fact that more than a few of the leading lights of post-Heideggerian hermeneutics have lived such long and productive lives? Probably, though I've no clear sense of what that lesson may be.

When I wrote my tribute to Jacques Derrida last year, I aligned my sympathies with those thinkers--like Ricoeur, Gadamer, Charles Taylor, and others--who see the moral and political task implied by the phenomenological response to Nietzsche's critique of modernity as an "anthropological" rather than a "deconstructive" one. That is, I think that rather than, through the interplay of presumably always fallible human perspectives, intensifying (and thereby perhaps--hopefully?--discovering contingent ethical resources in the midst of) the now-exposed vacuum left by modernity's withdrawal from ontological commitments under the cover of science and rationalism, I take our task to be one of discerning the "strong," perhaps even metaphysical, underpinnings of human subjectivity itself, and by so articulating them, concurrently investing them with greater force: as Nicholas H. Smith put it in Strong Hermeneutics, the aim is to "undertake a leveling up of identity-expressive, normative beliefs to cognitive status." This is done by shifting the philosophical project away from "the epistemological fragility of foundational truth claims" and towards the more anthropological "conditions of possibility of actual interpretive practices"--which means one's place in history and nature, of course, but most especially one's linguistic situatedness. Thinkers like Ricoeur believed phenomenology turned entirely upon "a recognition of the capacity of language to disclose domains of reality which constitute the independently subsisting subject matter of [all] competing interpretations" (SH, 1997, pgs. 19, 22, italics added). Depending on how one looks at it, this isn't so much a strengthening of subjectivity as it is an affirmation of purposefully "weak" revisions in traditional ontological thinking. But however the project is framed, it's a vital one, and Ricoeur's contributions to it have been enormous.
Indeed, for many thinkers Ricoeur, far more than Gadamer, was the central figure in hermeneutic thinking--perhaps because Gadamer mostly grounded his thought in ancient Greek philosophy and applied his philosophy mostly to narrow issues in the humanities, whereas Ricoeur engaged those intellectual fields where questions of hermeneutics have been more broadly accepted: psychology, Biblical criticism, legal thought, moral philosophy, and much more.

Ricoeur was, and remains, one of those philosophers I always told myself I would eventually get around to reading more of. He's in good company in that category--I should also read more Rawls, Kierkegaard, Hume, etc.--but he stands apart from them nonetheless, if only because wrestling with Ricoeur's ideas about what it means to identify oneself through the articulation of a belief, and thereby come to an understanding of the normative purchase which a belief has upon oneself, is of tremendous importance to a religious believer like myself. Ricoeur made plain the stakes for modern believers nearly 40 years ago, in his powerful study of the problem of evil (or rather, the problem of understanding it), The Symbolism of Evil. After having exhaustively considered the way in which modern thought teaches us to comprehend everything as solely a cognitive marker, a symbol bereft of meaning besides that which we ascribe to it, and the way in which phenomenology can potentially return us to the actual meaning of symbols, Ricoeur concludes:

"In every way, something has been lost, irremediably lost: immediacy of belief. But if we can no longer live the great symbolisms of the sacred in accordance with the original belief in them, we can, we modern men, aim at a second naivete in and through criticism. In short, it is by interpreting that we can hear again . . . [T]he second immediacy that we seek and the second naivete that we await are no longer accessible to us anywhere else than in a hermeneutics; we can only believe by interpreting. . . . [Some interpretation merely means] to display the multiple and inexhaustible intentions of each symbol, to discover intentional analogies between myths and rites, to run through the levels of experience and representation that are unified by the symbol. . . . Our analysis of the symbols and myths of human evil belongs to that sort of understanding . . . But is has not been possible to limit ourselves to such understanding of symbols [by way of other] symbols. There the question of truth is unceasingly eluded. Although the phenomenologist may give the name of truth to the internal coherence, the systematicity, of the world of symbols, such truth is truth without belief, truth at a distance, reduced, from which one has expelled the question: do I believe that? . . . Such is the wager. Only he can object to this mode of thought who thinks that philosophy, to begin from itself, must be a philosophy without presuppositions [that is, something that disconnects cognition from worldhood, something that refuses to recognize the anthropological]. A philosophy that starts from the fullness of language is a philosophy with presuppositions [or, with meaningful conditions to its own subjectivity]. To be honest, it must make its presuppositions explicit, state them as beliefs, and try to make the wager pay off in understanding" (SE, 1967, pgs. 352-354, 357).

There are interesting puzzles which will probably always plague this kind of wagering on belief, perhaps the greatest being the degree to which one can clarify what it means for a presumably embedded self to nonetheless affirm the results of such a "naive," interpretive wager. Is critique possible; can we pull ourselves out of our environments and judge our own convictions? If it is not possible, what is the point of speaking of the value of individual belief at all? Our "beliefs," or lack of them, would therefore be entirely in the hands of fate, or God, or some kind of Pinkeresque evolutionary psychology. But if it is possible, then how can anyone ever be clear on the "limits" of said critique, the point at which our own sense of critical subjectivity runs up against an "independently subsisting subject matter" (about what is evil, for example, or immoral or unjust) that shows us where the "domains of reality" believably begin (and end)?

Ricoeur's hermeneutical response builds upon an evaluation of our consciousness of the origin of our own prejudices and their place in our lives as human beings, and thus clearly involves an "attestation of self," wherein the subject's personal (though dialogic) extension into the whole is understood as making it freer, wiser, further along. In this way hermeneutics, and the wagering that through interpretation we can discover that aspect of being which "still speaks to us," is almost wholly a matter of "self-interpretation," or an extension of Maurice Merleau-Ponty's "I can" to a higher level. Ricoeur was aware of the solipsism which such a reading of interpretation makes possible. In Oneself as Another, he wrote that "the discourse of the 'I can' is, to be sure, a discourse in I. But the main emphasis is to be placed on the verb, on being-able-to-do, to which corresponds on the ethical plane, being-able-to-judge" (OA, 1992, pg. 181). This willingness to speak of the subject in relation to what is teleologically manifest in our linguistic appropriation of the given makes it possible to think about human freedom, social transformation, and moral innovation in a way which Gadamer, whose communal vision was ultimately somewhat passive, cannot. But it also potentially weakens the idea of a teleological revelation of truth in the event of language--which seems to me to necessarily be the operating ontological presumption behind the strong yet "naive" hermeneutical and linguistic embrace of the details of one's own place in the world in the first place. Heidegger (in whose footsteps all these thinkers tread, more or less) insisted that the fullest expression of that which is manifest through attendance to the conditions of our own existence as subjects is simply that "being speaks." For Ricoeur (in whose writings one could always discern the glimmer of old Marxist hopes), that wasn't enough: it had to be a matter that "being can still speak to me." 
That is, we can be modern believers and meaningful actors simultaneously; the modern, critical self can still engage the foundations of the world without eschewing that very modernity which allows us to think about the symbols of that foundation as symbols themselves.

To thinkers still committed to the fundamentals of Enlightenment, and to those thinkers who take different lessons from Nietzsche and Heidegger, "naive" is a pitch-perfect description of the hermeneutic project. It is a label that Ricoeur, to his credit, applied to his own writings with pride. Paul Ricoeur, RIP.

Thursday, May 19, 2005

Thanks to everyone for the many great comments following my post on my plans to build a political theory class around movies this summer. Some excellent, as well as some truly off-the-wall, suggestions. Also, it was gratifying to see so many people willing to use Star Trek episodes to teach political ideas. But then, this is the internet, after all. Let me tell you where my thinking currently stands:

The class is going to open with a discussion of movies and politics, and specifically some of the great examples of films inspiring, reflecting, or shaping political discourse in America; if possible, I'm going to show some clips from The Best Years of Our Lives (regarding America's postwar aspirations and supposed ideological "consensus") and Network (dealing, obviously, with the breakdown of such). (Thanks to David Salmanson for suggesting this idea.) Then I'm going to shift into talking about how we want to use films as texts to discuss basic political concepts, after which I'll introduce five general themes that we'll address through written material and films over the following few weeks (the class is only a month long). Right now the primary themes are going to be 1) democracy and populism; 2) violence and the law; 3) technology and freedom; 4) feminism and roles; and 5) conscience and consequentialism. That last is, obviously, a theme by no means particularly suited to a political theory class, but it allowed me to work in two of my favorite films, plus round out the last few days of class. With all of these films I'll assign essays dealing with the films themselves, as well as some general political theory material.

In the first section, I plan on showing All the King's Men and Meet John Doe (though the latter will be difficult to get a copy of); the readings will focus on criticisms of democracy, both ancient and modern. Hopefully I'll be able to lead the class towards a discussion of present-day populist claims, as manifest both on the left and the right, maybe working in some stuff about California politics, the role of referendums and initiatives in a representative government, and (assuming it's still going on in July) the argument over filibusters in the U.S. Senate. The second section will include showings of Dirty Harry and The Man Who Shot Liberty Valance; of course, the readings there will be from Hobbes and other social contract thinkers, and include some references to contemporary failed states and arguments over sovereignty and international law. The third section is one I'm still puzzling over. I definitely will show Gattaca (it's a great, stylish movie), but I don't know what to pair it with. I'm going to assign readings on genetic engineering and some broader stuff on technology in general, but I can't for the life of me think of any movies that present the "struggle with progress" in anything other than triumph-of-the-human-spirit terms. Simply put, Hollywood doesn't make movies which complicate our dependence upon technology; it's essentially invisible until it starts to actually shape or constrain our dealings with each other, at which point it becomes an enemy to be defeated. (In short, Gattaca, while about 100 times better and more subtle, does not take an essentially different approach to the problem of technology than the mostly awful I, Robot.) Any suggestions? I'm strongly tempted to bring in one of my all-time favorites, Brazil, but I'm afraid it may be just too confusing to get any good use out of, and is probably too long to watch in one class period to boot.

For the fourth section, I'll show Thelma & Louise and Adam's Rib. There's a ton of good feminist writing on the former; not as much on the latter, but I decided to focus more narrowly on the way feminism presents different alternatives for dealing with male dominance of the public sphere, rather than looking at power and gender in general, and that meant I needed to find a film which talked about "compromises" between the sexes. Besides, Adam's Rib is the movie Melissa and I watched on our very first date, and so it occupies an important place in my heart for that reason. Finally, to end on a more explicitly ethical and philosophical note, I'll show them Crimes and Misdemeanors and A Man for All Seasons--the latter, of course, being one of the great (if not the greatest) existentialist celebrations of integrity and deontological morality, the former being a superb consideration of consequentialism and the possibility of morality in the absence of a (feared) moral order. I'm going to require them to write a short paper for each section of the class, and I think I may make use of blogging as well; maybe set up a blog for the class where I'll post my lectures and encourage them to continue our discussions outside the classroom.

One theme that I wish I could work into the class, but I just don't think there'll be time for it, is a consideration of borders, and the way in which identity and citizenship are constructed and used politically. That would have allowed me to use Spanglish, a film which I thought was simply fabulous, as well as Lone Star, a film I've never seen but which was praised and recommended by several of those who commented on the previous post. Weirdly though, I just finally got around to seeing the recut version of Orson Welles's Touch of Evil, which--besides being a terribly dated but still terrific noir film--I found to be filled with far more and far deeper echoes of and observations about life along the U.S.-Mexico border than I had remembered. A lot of ugly feeling is on stark display in that movie. I wonder if it and Lone Star would make a productive combination? I guess I'll have to rent the latter and find out.

Wednesday, May 18, 2005

Well, the word is getting around, so I may as well announce it here: I've accepted a position at Western Illinois University for the coming academic year. It's an offer we've accepted without fear, I think, but not without some doubt. That is, while we're committed to and happy about our decision, I'm not absolutely sure it's the "right" thing for us to do. After all, we could have stuck around Arkansas State for another year, to see how the search for a theorist in the department goes one more time, and enjoy the small but significant amount of comfort and security we've found for ourselves here in the meantime. As I said last month (and before), I've really come to feel at home here, to care a lot for the fine students and faculty I've been privileged to teach, work with, and learn from, and frankly my family and I (particularly Caitlyn, our five-year-old, whom we've come to realize over the last few days is taking our decision to move away from her friends pretty hard) really don't want to be in the position of chasing after a bunch of unknowns. Unfortunately, as of last month it became clear to us that there were far more unknowns still lurking around here than we had previously thought. So we looked around for some other one-year possibilities, to measure against the one remaining for us here. WIU made an offer, and after counting up all the respective arguments both for and against the move (which really just ended up canceling each other out, as is so often the case), we came to the point that we simply needed to make a decision: stick with the unknowns immediately before us, or take a chance on those new unknowns over there? We made the latter choice, and feel good about it. Whether it'll get us what we want and hope for remains to be seen.

WIU is a state school located in Macomb, Illinois, about an hour from the Mississippi River, 90 minutes from both Peoria and Davenport, Iowa, and about 3 hours from both St. Louis and Chicago. It's farming country, like Jonesboro, but different crops: corn and soybean, mostly, rather than rice and cotton. It's small--about 20,000 people, and according to people I've spoken to in the department, that drops to under 10,000 during the summer. Melissa and I have done the small town thing before, but only in the South; we're looking forward to seeing how different (or better, or worse) the experience will be in the Midwest. The move will take us back to the land of four full seasons, which is nice, and it will also get us closer to Melissa's family in Michigan. And I've liked, so far, the people I've spoken to there, and the classes I'm lined up to teach. Yes, I guess you could say we're excited--nervous, but excited. We've never lived in any one place longer than three years anyway (even our stay in Virginia while I was in graduate school was divided up by a couple of moves), so perhaps this is just some small part of our brains telling us it's time to move on. In any case, we'll be here until I complete my summer teaching (around the first week of August), and then we'll be Illinois bound. In the meantime, if any of you out there know anything that might help us prepare for western Illinois--about parks, schools, food, weather, anything--we'd be delighted to hear from you.

Expect regular blogging--well, as regular as it ever gets around here--to resume tomorrow.

Monday, May 02, 2005

And so, on to the summer. (Assuming I survive the usual finals week grading chaos, that is.) Second summer term, starting in July, I'm going to be teaching a political theory course on politics and film; specifically, I want to look at certain movies which have advanced or contributed to political arguments and ideas, and use those films (and whatever writings and reviews I can dig up on them) as a springboard to discuss the arguments and ideas which they communicate or assume. In short, this isn't going to be a "the politics of film"-type of class. I'm not interested in forcing a bunch of political science students to acquaint themselves with film and cultural criticism so as to be able to interrogate what movies do (as worthy a subject as that is); rather, I'm looking to make use of what any number of movies have said about this or that political or social subject. (Is it just an excuse to watch movies in class during the summer and thus escape lecturing? Or is it an honest attempt to teach students how to respond intellectually to the art form they are most often exposed to anyway? Why, a little bit of both, of course.)

The primary question, of course, is what films to watch? We probably won't watch more than eight, because I want ample class time to remain for discussing various readings which I will assign both before and after we watch any given film. But I can't start collecting those readings until I'm sure what films to include. Ideally I'd like the class to be able to watch a couple of films in a row that bring up, in a significant way, a single theme or cluster of themes and would be able to serve more or less as antipodes to our discussion. Perhaps that won't be possible, but that's what I'm aiming for at the moment. So I'm looking for recommendations. Remember, what I'm searching for here are movies with some provocative political content or which lend themselves to conflicting political interpretations, not films that are, in themselves, political works (i.e., no Fahrenheit 9/11, no Triumph of the Will), or movies that are entirely about political events (i.e., no All the President's Men). And simply to make obtaining and viewing these movies easier, and to increase the likelihood that there is some good theoretical writing out there discussing them, I'd like to stick with major, English-language releases. So far the films I'm seriously thinking about including are Dirty Harry and The Man Who Shot Liberty Valance (addressing violence and the social contract); and Thelma & Louise and In the Company of Men (addressing gender and power), though I could probably be talked out of those. Other films I'd really like to include, but I'm uncertain about how to best frame them, are Meet John Doe, Gattaca, Crimes and Misdemeanors, and One Flew Over the Cuckoo's Nest. Any suggestions or criticisms would be welcome.

Quotes

"Every one of the standards according to which action is condemned demands action. Although the dignity of persons is inevitably violated in action, this dignity would be far less recognized in the world than it is had it not been supported by actions such as the establishment of constitutions and the fighting of wars in defense of human rights. Action must be untruthful, yet religion, science, philosophy, and the arts, the main forms of absolute fidelity to the truth, could not survive were they unsupported by action. Action cannot but be anticommunal in some measure, yet communal relationships would be almost nonexistent without areas of peace and order, which are created by action. We must act hesitantly and regretfully, then, but still we must act."

(Glenn Tinder, The Political Meaning of Christianity: The Prophetic Stance [HarperSanFrancisco, 1991], 215)

"[T]he press was still the last resource of the educated poor who could not be artists and would not be tutors. Any man who was fit for nothing else could write an editorial or a criticism....The press was an inferior pulpit; an anonymous schoolmaster; a cheap boarding-school; but it was still the nearest approach to a career for the literary survivor of a wrecked education."

"Mailer was a Left Conservative. So he had his own point of view. To himself he would suggest that he tried to think in the style of [Karl] Marx in order to attain certain values suggested by Edmund Burke."

(Norman Mailer, The Armies of the Night [The New American Library, 1968], 185)

"All those rely on their hands, and each is skillful at his own craft. / Without them a city would have no inhabitants; no settlers or travellers would come to it. / Yet they are not in demand at public discussions, nor do they attain to high office in the assembly. They do not sit on the judge's bench or understand the decisions of the courts. They cannot expound moral or legal principles and are not ready with maxims. / But they maintain the fabric of this world, and the practice of their craft is their prayer."

(Ecclesiasticus (Sirach) 38:31-34, in The Revised English Bible with the Apocrypha [Oxford University Press, 1989])

"The tendency, which is too common in these days, for young men to get a smattering of education and then think themselves unsuited for mechanical or other laborious pursuits is one that should not be allowed to grow up among us...Every one should make it a matter of pride to be a producer, and not a consumer alone."

(Wilford Woodruff, Millennial Star [November 14, 1887], 773)

"We are parts of the world; no one of us is an isolated world-whole. We are human beings, conceived in the body of a mother, and as we stepped into the larger world, we found ourselves immediately knotted to a universe with the thousand bands of our senses, our needs and our drives, from which no speculative reason can separate itself."

"'Business!' cried the Ghost, wringing its hands again. 'Mankind was my business. The common welfare was my business; charity, mercy, forbearance, and benevolence, were all my business. The dealings of my trade were but a drop of water in the comprehensive ocean of my business!'"

(Charles Dickens, A Christmas Carol [Candlewick Press, 2006], 35)

"The Master said, 'At fifteen, I set my mind upon learning; at thirty, I took my place in society; at forty, I became free of doubts; at fifty, I understood Heaven's Mandate; at sixty, my ear was attuned; and at seventy, I could follow my heart's desires without overstepping the bounds of propriety.'"

"Lack of experience diminishes our power of taking a comprehensive view of the admitted facts. Hence those who dwell in intimate association with nature and its phenomena grow more and more able to formulate, as the foundations of their theories, principles which admit a wide and coherent development: while those whom devotion to abstract discussions has rendered unobservant of the facts are too ready to dogmatize on the basis of a few observations."

"[God] does not want men to give the Future their hearts, to place their treasure in it. . . . His ideal is a man who, having worked all day for the good of posterity (if that is his vocation), washes his mind of the whole subject, commits the issue to Heaven, and returns at once to the patience or gratitude demanded by the moment that is passing over him."

"Money is simply a tool. We use money as a proxy for our time and labor--our life energy--to acquire things that we cannot (or care not to) procure or produce with our own hands. Beyond that, it has limited actual utility: you can't eat it; if you bury it in the ground, it will not produce a crop to sustain a family; it would make a lousy roof and a poor blanket. To base our understanding of economy simply on money overlooks all other methods of exchange that can empower communities. Equating an economy only with money assumes there are no other means by which we can provide food for our bellies, a roof over our heads and clothing on our backs."

"A scholar's business is to add to what is known. That is all. But it is capable of giving the very greatest satisfaction, because knowledge is good. It does not have to look good or even sound good or even do good. It is good just by being knowledge. And the only thing that makes it knowledge is that it is true. You can't have too much of it and there is no little too little to be worth having. There is truth and falsehood in a comma."

"I believe in democracy. I accept it. I will faithfully serve and defend it. I believe in it because it appears to me the inevitable consequence of what has gone before it. Democracy asserts the fact the masses are now raised to a higher intelligence than formerly. All our civilization aims at this mark. We want to do what we can to help it. I myself want to see the result. I grant that it is an experiment, but it is the only direction society can take that is worth its taking; the only conception of its duty large enough to satisfy its instincts; the only result that is worth an effort or a risk. Every other possible step is backward, and I do not care to repeat the past. I am glad to see society grapple with issues in which no one can afford to be neutral."