Swarthmore Commencement Address -- Kwame Anthony Appiah

May 2006

Often I find myself seated next to a stranger on an airplane who asks me what I do. Sometimes I say I'm a philosopher. The commonest responses are:

An expression that combines boredom and alarm, and the end of the conversation (which leaves you with the pretzels and the soda, and the really fascinating article from the Review of Metaphysics you've been meaning to get to for a couple of years) and

"So, what's your philosophy?"

To that question, I usually reply: "Everything is much more complicated than you first thought." In philosophy at least, that really is my philosophy. So, I can tell the truth and we can both get back to those wonderfully inviting pretzels.

The truth, I said: I happen to be a great believer in objective truth. But one way in which things get more complicated than you thought is that I'm also a great believer in what philosophers call fallibilism. Fallibilism is the idea that our knowledge is imperfect, provisional, subject to revision in the face of new evidence. Fallibilism says: Here's what I know to a moral certainty, know well enough to live by. But I could be wrong.

Does that sound feeble? It isn't. Fallibilism made science possible. For centuries, doctrine had held that there were demonstrative, self-evident truths, as with a Euclidean theorem, and that all else was mere opinion. Today, scientific fallibilism has spread across the globe; it holds sway even in the theocracy of Iran, where scientific and technological knowledge is revered, especially, alas, when it produces new forms of weaponry.

And the expansion of human knowledge this planet has seen of late has been enormous and accelerating. When I was an undergraduate, my biochemistry tutor gave me a print of one of Carpaccio's great murals of St. Jerome. Years later, looking at the original in Venice, it struck me that the shelf of books behind the saint — his library — contained almost everything that he would have thought worth reading, and he would surely have read all of them. Now, a single web site may contain millions of times the number of pages that St. Jerome could have read in his lifetime. Once, to have heard the range of music that you can scan on your radio, you would have had to travel thousands of miles, hoping to be on time for the right performances. Your DVD collections may well contain more hours of acting than Samuel Pepys — that devoted theatergoer — saw in his entire lifetime. There used to be a time when the proof of a new mathematical theorem was a rare and noteworthy event; and now? Something in the ballpark of seventy-five thousand new mathematical theorems were proved just last year. (I have this on good authority, although, of course, I could be wrong.)

Intellectually, therefore, you will be the best-equipped class of graduating seniors in history, at least until those blighters from the Class of '07 roll around. So here's a conundrum for you. The past century has been an age of unprecedented scientific and scholarly mastery; it has also been an age of unprecedented bloodshed. Given everything our species has learned about historiography, literature, engineering, sociology, molecular biology, algebraic topology, and the rest, why haven't our ethical attainments kept pace? If we're so good at math, why haven't we become whizzes at morality?

You may take it as an extra-credit assignment, on which you'll be graded by posterity; I certainly can't pretend to have the answer. Still, I see some reasons to be hopeful. Three centuries ago, slavery was everywhere accepted as part of the natural order. No longer. The curtains of patriarchy — what John Stuart Mill called "the subjection of women" — have, at least, been parted, if not yet pulled back. Across much of the planet, we honor, in words, anyway, ideals such as liberty and equality and, not least, tolerance.

Yet tolerance, too, is more complicated than it looks. We can't suppose that mindless tolerance of cruelty and repression is a virtue. Yet how much evil is done by fanatics who can't countenance the possibility that their beliefs, sanctioned by ideological or religious authority, might conceivably be mistaken! Here, then, is one of the uncompleted tasks of our era: to spread fallibilism — not skepticism about the truth or indifference to it, but just the glimmering recognition that one may not be in full possession of it — from the empyrean of scientific fact to the hardpan of moral conviction: to make it as common as Coca-Cola. People say that common sense is the ability to see what's in front of your eyes. But even madmen and extremists can see what's in front of their eyes; so, again, I think it's more complicated than that. Common sense, I'd prefer to say, involves the ability to see what's in front of the other fellow's eyes. That's what makes it something we might have in common.

When you graduate from Swarthmore, whatever you pursue, you'll surely, in myriad ways, add to the relentless expansion of human knowledge. As you do so, I suggest only that, by way of common sense, you see what contributions you can make to that other great, collective project of moral understanding. Indeed, I have every confidence that your contributions will be substantial... though, of course, I could turn out to be wrong.