Wednesday, May 4, 2011

A book arguing for the power of the concept of cognitive dissonance to explain “why we justify foolish beliefs, bad decisions, and hurtful acts” lacks one thing: a defensible explanation of what cognitive dissonance is.

The following is not a review but merely a comment on one particular point in the book Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts by Carol Tavris and Elliot Aronson,1 namely its failure to explain the concept and the associated theory that are the central theme of its argument. I ought perhaps to mention at this point, since you might think otherwise upon reading what follows, that I found the book immensely instructive and disturbing in a potentially very salutary way. Its strength lies in its description and analysis of the various ways in which our need to feel justified in what we think, say, and do drives us to think, say, and do wrong and harmful things. Its weakness lies in its failure to explain the rubric under which it does this work of description and analysis, the concept of cognitive dissonance.

* * *

Before I read this book, I was acquainted with the term “cognitive dissonance” but had only a rather vague notion of what it means. Having read the book, I have a better idea of what it means, and of the psychological research that is associated with it; but the book contains no satisfactory explanation either of what cognitive dissonance is or of what cognitive dissonance theory is. The authors repeatedly say that cognitive dissonance theory predicts this and cognitive dissonance theory predicts that, but they never tell us what the theory is—an omission that diminishes not only the usefulness of their book but also the credibility of their argument. We cannot make any informed judgment of the value of the theory if we are never told what it is, but told only of its alleged predictive successes.

Aronson and Tavris offer an explanation of the term “cognitive dissonance” at one point; but it is quite inadequate. It occurs just after an account of the researches of social psychologist Leon Festinger and his collaborators on the response of the followers of a pretended seer, one Marian Keech, to the failure of her prophecy that on a certain date a spaceship would come to rescue them before the earth would be destroyed.2 One might suppose, if one has not previously observed how the adherents of such prophecies behave when confronted with the failure of them, that the followers would be disillusioned and see that their faith in Mrs. Keech was misplaced. But Festinger, the authors report, made a more nuanced, specific, and, as it transpired, more accurate prediction:

The believers who had not made a strong commitment to the prophecy—who awaited the end of the world by themselves at home, hoping they weren’t going to die at midnight—would quietly lose their faith in Mrs. Keech. But those who had given away their possessions and were waiting with the others for the spaceship would increase their belief in her mystical abilities. In fact, they would now do everything they could to get others to join them. (12)

At the end, the authors observe, “Mrs. Keech’s prediction had failed, but not Leon Festinger’s.” They then move on to the theory to which they credit this prediction—the theory of cognitive dissonance. They write:

The engine that drives self-justification, the energy that produces the need to justify our actions and decisions—especially the wrong ones—is an unpleasant feeling that Festinger called “cognitive dissonance.” Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day.” Dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it. In this example, the most direct way for a smoker to reduce dissonance is by quitting. But if she has tried to quit and failed, now she must reduce dissonance by convincing herself that smoking isn’t really so harmful, or that smoking is worth the risk because it helps her relax or prevents her from gaining weight (and after all, obesity is a health risk, too), and so on. Most smokers manage to reduce dissonance in many such ingenious, if self-deluding, ways. (13)

The authors cite the pair of thoughts “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day” as an example of “two cognitions that are psychologically inconsistent.” But is there any inconsistency at all between these two thoughts? Certainly they are not logically inconsistent: it is possible for both to be true. Nor is there any kind of probabilistic conflict between the two: it does not defy probability that both should be true. The authors say, in the paragraph immediately following the one just quoted, “Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity . . .” But there is no contradiction between the two cognitions in the example.

The authors say that the two cognitions are psychologically inconsistent. But what is that supposed to mean? That no one can affirm both thoughts at the same time? But surely people can do so; if they could not, then this pair of cognitions could not be an example of cognitive dissonance! Wherein, then, is the “psychological inconsistency” supposed to consist? Perhaps in the fact that affirming both thoughts creates discomfort? But the discomfort was supposed to be the effect of a so-called psychological inconsistency. If the so-called inconsistency is nothing other than the discomfort itself, then the definition amounts to saying that cognitive dissonance is the state of tension that occurs whenever a person holds two cognitions that produce a state of tension—which tells us essentially nothing.

It is a dismal failing for a book to give no satisfactory explanation of the very concept that is at the core of its argument. We are left to figure out for ourselves what the concept is from the evidence of the use that the authors make of it.

One point about the concept that is clear is that it has an immediate bearing on the common human proclivity for self-justification. It is, in fact, supposed to provide the answer to the question implied by the book’s subtitle: “why we justify foolish beliefs, bad decisions, and hurtful acts.” We justify, or attempt to justify, such things because it is difficult for us to accept that our beliefs have been foolish, our decisions bad, or our acts hurtful. It is surely these negative evaluations of ourselves that are the source of the discomfort of which the authors speak. In the example quoted above, there is, as I said earlier, no inconsistency between the thoughts “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day”; but the combination of those thoughts entails the thought “I do a dumb thing.” That implication, and not any inconsistency between the first two thoughts, is the source of our discomfort. To reduce dissonance, we must do things, or rather think things, that will allow us to avoid accepting that conclusion.

It seems to me that all of the examples discussed by the authors fit under this explanation of the concept better than they fit under the explanation that they give. Marian Keech could not give up the idea that she had visionary powers because she had built so much of her understanding and evaluation of herself upon that idea. Her most devoted followers could not give up that idea precisely because they had devoted themselves to her in quite costly ways: to admit that their faith in her was misplaced would be to admit that they had been extravagantly foolish.

Further, it is evident that many cases that fit under the authors’ definition will not illustrate what they mean by cognitive dissonance. Suppose, for instance, that I remember distinctly, or seem to remember distinctly, leaving a book in a certain place a short time ago, but that when I return to that place, I don’t find the book there (and suppose also that I have been alone in my room all the while). This may cause me perplexity, consternation, irritation, frustration, and other unpleasant emotions, but it will not give rise to what Aronson and Tavris seem to have in mind when they use the term “cognitive dissonance.” Certainly it will not drive me to try to explain the non-appearance of the book in self-justifying ways. Rather, my reaction will most likely be first to look around to see if the book has fallen down somewhere, and then, if that does not lead to the discovery of it, to conclude that my memory is at fault: I must have put the book somewhere else and forgotten doing so. Yet here we clearly have a case of discomfort produced by an inconsistent pair of cognitions—“I left the book right here (and no one else has been around to move it)” and “The book is not here.” There is no cognitive dissonance involved because the conflict between these two cognitions does not, or does not seriously, threaten my evaluation of myself. It does compel me to acknowledge the faultiness of my memory, but it will hardly be the first occasion on which I have had to do so.

In sum, what the authors talk about under the heading of “cognitive dissonance” is not, as they say in their attempt at a definition of the term, an inconsistency between two cognitions, but an inconsistency between some body of cognitions and our estimation of ourselves.

After writing the comment above, I came across the following passage in the Wikipedia article “Cognitive Dissonance”:

An overarching principle of cognitive dissonance is that it involves the formation of an idea or emotion in conflict with a fundamental element of the self-concept, such as “I am a successful/functional person,” “I am a good person,” or “I made the right decision.”

I wish that I had a better source for the attribution of this principle to the concept or the theory of cognitive dissonance than Wikipedia, but as far as it goes, it confirms the argument that I developed independently. What puzzles me is that something so obviously important would fail to make its way into the argument of Mistakes Were Made. Elliot Aronson, also according to Wikipedia (the article on him), “is listed among the 100 most eminent psychologists of the 20th Century,” “is the only person in the 120-year history of the American Psychological Association to have won all three of its major awards: for writing, for teaching, and for research,” and “in 2007 . . . received the William James Award for Lifetime Achievement from the Association for Psychological Science.” Why he and Carol Tavris failed to include this essential point in their exposition—which is virtually a non-exposition—of the central concept of their book, I do not know, but the fact that they did so confirms my suspicion that sloppiness in the handling of crucial concepts is very common in the discipline of psychology.

1Carol Tavris and Elliot Aronson, Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts (Orlando: Harcourt, 2007).

2Leon Festinger, Henry W. Riecken, and Stanley Schachter, When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the End of the World (Minneapolis: University of Minnesota Press, 1956).