Meta

As usual, I was engrossed in some rambling rant about a book I was reading—no doubt enlarging upon the author’s marvelous intellect (and, by association, my own). My poor friend, who is by now used to this sort of thing, suddenly asked me:

“Do you really think reading all these books has made you a better person?”

“Well, yeah…” I stuttered. “I think so…”

An awkward silence took over. I could truthfully say that reading had improved my mind, but that wasn’t the question. Was I better? Was I wiser, more moral, calmer, braver, kinder? Had reading made me a more sympathetic friend, a more caring partner? I didn’t want to admit it, but the answer seemed to be no.

This wasn’t an easy thing to face up to. My reading was a big part of my ego. I was immensely proud, indeed even arrogant, about all the big books I’d gotten through. Self-study had strengthened a sense of superiority.

But now I was confronted with the fact that, however much more knowledgeable and clever I had become, I had no claim to superiority. In fact—although I hated even to consider the possibility—reading could have made me worse in some ways, by giving me a justification for being arrogant.

This phenomenon is by no means confined to myself. Arrogance, condescension, and pretentiousness are ubiquitous qualities in intellectual circles. I know this both at first- and second-hand. While lip-service is often given to humility, the intellectual world is rife with egotism. And often I find that the more well-educated someone is, the more likely they are to assume a condescending tone.

This is the same condescending tone that I sometimes found myself using in conversations with friends. But condescension is of course more than a tone; it is an attitude towards oneself and the world. And this attitude can be fostered and reinforced by habits you pick up through intellectual activity.

For me, one of these habits is argumentativeness, most closely connected with reading philosophy. Philosophy is, among other things, the art of argument; and good philosophers are able to bring to their arguments a level of rigor, clarity, and precision that is truly impressive. The irony here is that there is far more disagreement in philosophy than in any other discipline. To be fair, this is largely due to the abstract, mysterious, and often paradoxical nature of the questions philosophers investigate, which resist even the most thorough analysis.

Nevertheless, given that their professional success depends upon putting forward the strongest argument on a given problem, philosophers devote a lot of time to picking apart the theories and ideas of their competitors. Indeed, the demolition of a rival point of view can assume supreme importance. A good example of this is Gilbert Ryle’s The Concept of Mind, a brilliant and valuable book, but one that is mainly devoted to debunking an old theory rather than putting forward a new one.

This sort of thing isn’t confined to philosophy, of course. I have met academics in many disciplines whose explicit goal is to quash another theory rather than to provide a new one. I can sympathize with this, since proving an opponent wrong can feel immensely powerful. To find a logical fallacy, an unwarranted assumption, an ambiguous term, an incorrect generalization in a competitor’s work, and then to focus all your firepower on this structural weakness until the entire argument comes tumbling down—it’s really satisfying. Intellectual arguments can have all the thrill of combat, with none of the safety hazards.

But to steal a phrase from the historian Richard Fletcher, disputes of this kind usually generate more heat than light. Disproving a rival claim is not the same thing as proving your own claim. And when priority is given to finding the weaknesses rather than the strengths of competing theories, the result is bickering rather than the pursuit of truth.

To speak from my own experience, in the past I got to the point where I considered it a sign of weakness to agree with somebody. Endorsing someone else’s conclusions without reservations or qualifications was just spineless. And to fail to find the flaws in another thinker’s argument—or, worse yet, to put forward your own flawed argument—was simply mortifying for me, a personal failing. Needless to say, this mentality is neither desirable nor productive, personally or intellectually.

Besides argumentativeness, another condescending habit that intellectual work can reinforce is name-dropping.

In any intellectual field, certain thinkers reign supreme. Their theories, books, and even their names carry a certain amount of authority; and this authority can be commandeered by secondary figures through name-dropping. This is more than simply repeating a famous person’s name (although that’s common); it involves positioning oneself as an authority on that person’s work.

Two books I read recently—Mortimer Adler’s How to Read a Book, and Harold Bloom’s The Western Canon—are prime examples of this. Both authors wield the names of famous authors like weapons. Shakespeare, Plato, and Newton are bandied about, used to cudgel enemies and to cow readers into submission. References to famous thinkers and writers can even be used as substitutes for real argument. This is the infamous argument from authority, a fallacy easy to spot when explicit, but much harder when used in the hands of a skilled name-dropper.

I have certainly been guilty of this. Even while I was still an undergraduate, I realized that big names have big power. If I even mentioned the names of Dante or Milton, Galileo or Darwin, Hume or Kant, I instantly gained intellectual clout. And if I found a way to connect the topic under discussion to any famous thinker’s ideas—even if that connection was tenuous and forced—it gave my opinions weight and made me seem more “serious.” Of course I wasn’t doing this intentionally to be condescending or lazy. At the time, I thought that name-dropping was the mark of a dedicated student, and perhaps to a certain extent it is. But there is a difference between appropriately citing an authority’s work and using their work to intimidate people.

There is a third way that intellectual work can lead to condescending attitudes, and that is, for lack of a better term, political posturing. This particular attitude isn’t very tempting for me, since I am by nature not very political, but this habit of mind is extremely common nowadays.

By political posturing I mean several related things. Most broadly, I mean when someone feels that people (himself included) must hold certain beliefs in order to be acceptable. These can be political or social beliefs, but they can also be more abstract, theoretical beliefs. In any group—be it a university department, a political party, or just a bunch of friends—a certain amount of groupthink is always a risk. Certain attitudes and opinions become associated with the group, and they become a marker of identity. In intellectual life this is a special hazard because proclaiming fashionable and admirable opinions can replace the pursuit of truth as the criterion of acceptability.

At its most extreme, this kind of political posturing can lead to a kind of gang mentality, wherein disagreement is seen as evil and all dissent must be punished with ostracism and mob justice. This can be observed in the Twitter shame campaigns of recent years, but a similar thing happens in intellectual circles.

During my brief time in graduate school, I felt an intense and ceaseless pressure to espouse leftist opinions. This seemed to be ubiquitous: students and professors sparred with one another, in person and in print, by trying to prove that their rivals were not genuinely right-thinking (or “left-thinking,” as the case may be). Certain thinkers could not be seriously discussed, much less endorsed, because their works had intolerable political ramifications. Contrariwise, questioning the conclusions of properly left-thinking people could leave you vulnerable to accusations about your fidelity to social justice or economic equality.

But political posturing has a milder form: know-betterism. Know-betterism is political posturing without the moral outrage, and its practitioners are smug rather than indignant.

The book Language, Truth, and Logic by A.J. Ayer comes to mind, wherein the young philosopher, still in his mid-twenties, simply dismisses the work of Plato, Aristotle, Spinoza, Kant and others as hogwash, because it doesn’t fit into his logical positivist framework.

Indeed, logical positivism is an excellent example of the pernicious effects of know-betterism. In retrospect, it seems incredible that so many brilliant people endorsed it, because logical positivism has crippling and obvious flaws. Not only did people believe it; they thought it was “The Answer”—the solution to every philosophical problem—and considered anyone who thought otherwise a crank or a fool, somebody who couldn’t see the obvious. This is the danger of groupthink: when everyone “in the know” believes something, it can seem obviously right, regardless of the strength of the ideas.

The last condescending attitude I want to mention is rightness—the obsession with being right. Now of course there’s nothing wrong with being right. Getting nearer to the truth is the goal of all honest intellectual work. But to be overly preoccupied with being right is, I think, both an intellectual and a personal shortcoming.

As far as I know, the only area of knowledge in which real certainty is possible is mathematics. The rest of life is riddled with uncertainty. Every scientific theory might, and probably will, be overturned by a better theory. Every historical treatise is open to revision when new evidence, priorities, and perspectives arise. Philosophical positions are notoriously difficult to prove, and new refinements are always around the corner. And despite the best efforts of the social sciences, the human animal remains a perpetually surprising mystery.

To me, this uncertainty in our knowledge means that you must always be open to the possibility that you are wrong. The feeling of certainty is just that—a feeling. Our most unshakeable beliefs are always open to refutation. But when you have read widely on a topic, studied it deeply, and thought it through thoroughly, it gets more and more difficult to believe that you could possibly be in error. Because so much effort, thought, and time has gone into a conclusion, it can be personally devastating to think that you are mistaken.

This is human, and understandable, but it can also clearly lead to egotism. For many thinkers, it becomes their goal in life to impose their conclusions upon the world. They struggle valiantly for the acceptance of their opinions, and grow resentful and bitter when people disagree with or, worse, ignore them. Every exchange thus becomes a struggle to push your views down another person’s throat.

This is not only an intellectual shortcoming—since it is highly unlikely that your views represent the whole truth—but it is also a personal shortcoming, since it makes you deaf to other people’s perspectives. When you are sure you’re right, you can’t listen to others. But everyone has their own truth. I don’t mean that every opinion is equally valid (since there are such things as uninformed opinions), but that every opinion is an expression, not only of thoughts, but of emotions, and emotions can’t be false.

If you want to have a conversation with somebody instead of giving them a lecture, you need to believe that they have something valuable to contribute, even if they are disagreeing with you. In my experience it is always better, personally and intellectually, to try to find some truth in what someone is saying than to search for what is untrue.

Lastly, being overly concerned with being right can make you intellectually timid. Going out on a limb, disagreeing with the crowd, putting forward your own idea—all this puts you at risk of being publicly wrong, and thus will be avoided out of fear. This is a shame. The greatest adventure you can take in life and thought is to be extravagantly wrong. Name any famous thinker, and you will be naming one of the most gloriously incorrect thinkers in history. Newton, Darwin, Einstein—every one of them has been wrong about something.

For a long time I have been the victim of all of these mentalities—argumentativeness, name-dropping, political posturing, know-betterism, and rightness—and to a certain extent, I probably always will be. What makes them so easy to fall into is that they are positive attitudes taken to excess. It is admirable and good to subject claims to logical scrutiny, to read and cite major authorities, to advocate for causes you think are right, to respect the opinions of your peers and colleagues, and to prioritize getting to the truth.

But taken to excess, these habits can lead to egotism. They certainly have with me. This is not a matter of simple vanity. Not only can egotism cut you off from real intimacy with other people, but it can lead to real unhappiness, too.

When you base your self-worth on beating other people in argument, being more well read than your peers, being on the morally right side, being in the know, being right and proving others wrong, then you put yourself at risk of having your self-worth undermined. To be refuted will be mortifying, to be questioned will be infuriating, to be contradicted will be intolerable. Simply put, such an attitude will put you at war with others, making you defensive and quick-tempered.

An image that springs to mind is of a giant castle with towering walls, a moat, and a drawbridge. On the inside of this castle, in the deepest chambers of the inner citadel, is your ego. The fortifications around your ego are your intellectual defenses—your skill in rhetoric, logic, argument, debate, and your impressive knowledge. All of these defenses are necessary because your sense of self-worth depends on certain conditions: being perceived, and perceiving oneself, as clever, correct, well-educated, and morally admirable.

Intimacy is difficult in these circumstances. You let down the drawbridge for people you trust, and let them inside the walls. But you test people for a long time before you get to this point—making sure they appreciate your mind and respect your opinions—and even then, you don’t let them come into the inner citadel. You don’t let yourself be totally vulnerable, because even a passing remark can lead to crippling self-doubt when you equate your worth with your intellect.

Thus the fundamental mindset that leads to all of the bad habits described above is the belief that being smart, right, or knowledgeable is the source of your worth as a human being. This is dangerous, because it means that you constantly have to reinforce the idea that you have all of these qualities in abundance. Life then becomes a constant performance, an act for others and for yourself. And because a part of you knows that it’s an act—a voice you try to ignore—it also leads to considerable bad faith.

As for the solution, I can only speak from my own experience. The trick, I’ve found, is to let down my guard. Every time you defend yourself you make yourself more fragile, because you tell yourself that there is a part of you that needs to be defended. When you let go of your anxieties about being wrong, being ignorant, or being rejected, your intellectual life will be enriched. You will find it easier to learn from others, to consider issues from multiple points of view, and to propose original solutions.

Thus I can say that reading has made me a better person, not because I think intellectual people are worth more than non-intellectuals, but because I realized that they aren’t.

Great conclusion. I breathed a sigh of relief. Reading can make someone either a better or worse person depending on what “better” means. And, I guess, depending on what one reads. It can prompt someone to take on broader empathy for others and it can lead to no empathy at all, as you’ve explained. And even if a lot of reading correlates with a particular human virtue or vice, it’s fair to remind ourselves that correlation is not causation.

Well, I was speaking to a friend of mine who studies math professionally, and he told me that usually, once a proof has been thoroughly reviewed, it is not overturned or refuted. But theories are constantly overturned in science, history, and philosophy. Does it seem like an assertion in need of more qualification?

From my reading, the objectivity of mathematics is a contentious issue, particularly within the philosophy of mathematics. In my experience, which is by no means a reliable indicator in any case, the majority of mathematicians seem very much in favor of its objectivity, but there are some contrarians—e.g. Wittgenstein, who might compare it to the objectivity of, say, chess.

I don’t want to risk insulting your friend but I don’t understand his/her relevance unless what you’re saying is that you have previously taken your friend’s opinion as gospel, which I would understand.

That said, you could simply be referring to mathematics as unquestionable in the same sense that the rules of chequers are unquestionable. In a philosophical sense, I suppose I’m interested in whether you see mathematics as a system that is discovered or created.

That’s an interesting question! Honestly I haven’t made up my mind about the metaphysical status of mathematics, although normally I incline towards Wittgenstein’s view.

All I meant by “certainty” is that, within the bounds of their axioms, mathematical proofs are unquestionable. My friend confirmed this by telling me that mathematical innovation doesn’t consist of overturning older work, like science sometimes does, but builds on it. He didn’t have any solid opinion about the ontological status of mathematical truths.