The Trouble with Scientism
Why history and the humanities are also a form of knowledge

There are two cathedrals in Coventry. The newer one, consecrated on May 25, 1962, stands beside
the remains of the older one, which dates from the fourteenth century, a ruin testifying to the
bombardment of the Blitz. Three years before the consecration, in one of the earliest ventures in
the twinning of towns, Coventry had paired itself with Dresden. That gesture of reconciliation
was recapitulated in 1962, when Benjamin Britten’s War Requiem received its first performance at
the ceremony. The three soloists were an English tenor (Peter Pears), a German baritone (Dietrich
Fischer-Dieskau), and a soprano from Northern Ireland (Heather Harper), who stepped in when Soviet authorities refused to allow the intended Russian soloist, Galina Vishnevskaya, to attend.

Since the 1960s, historians have worked—and debated—to bring into focus the events of the night
of February 13, 1945, on which an Allied bombing attack devastated the strategically irrelevant city
of Dresden. An increased understanding of the decisions that led to the fire-bombing, and of the
composition of the Dresden population that suffered the consequences, has altered subsequent
judgments about the conduct of war. The critical light of history has been reflected in the
contributions of novelists and critics, and of theorists of human rights. Social and political
changes, in other words, followed the results of humanistic inquiry, and were intertwined with
the reconciliatory efforts of the citizens of Coventry and Dresden. Even music and poetry played
roles in this process: what history has taught us is reinforced by the lines from Wilfred Owen that
Britten chose as the epigraph for his score—“My subject is war, and the pity of war. The poetry is
in the pity. All a poet can do today is warn.” It is so easy to underrate the impact of the humanities
and of the arts. Too many people, some of whom should know better, do it all the time. But
understanding why the natural sciences are regarded as the gold standard for human knowledge
is not hard. When molecular biologists are able to insert fragments of DNA into bacteria and turn
the organisms into factories for churning out medically valuable substances, and when
fundamental physics can predict the results of experiments with a precision comparable to
measuring the distance across North America to within the thickness of a human hair, their
achievements compel respect, and even awe. To derive one’s notion of human knowledge from
the most striking accomplishments of the natural sciences easily generates a conviction that other
forms of inquiry simply do not measure up. Their accomplishments can come to seem inferior,
even worthless, at least until the day when these domains are absorbed within the scope of “real
science.”

The conflict between the Naturwissenschaften and the Geisteswissenschaften goes back at least two
centuries, and intensified as ambitious, sometimes impatient researchers proposed to
introduce natural scientific concepts and methods into the study of human psychology and
human social behavior. Their efforts, and the attitudes of unconcealed disdain that often inspired
them, prompted a reaction, from Vico to Dilthey and into our own time: the insistence that some
questions are beyond the scope of natural scientific inquiry, too large, too complex, too imprecise,
and too important to be addressed by blundering over-simplifications. From the nineteenth-
century ventures in mechanistic psychology to contemporary attempts to introduce evolutionary
concepts into the social sciences, “scientism” has been criticized for its “mutilation”
(Verstümmelung, in Dilthey’s memorable term) of the phenomena to be explained.

The problem with scientism—which is of course not the same thing as science—is owed to a
number of sources, and they deserve critical scrutiny. The enthusiasm for natural scientific
imperialism rests on five observations. First, there is the sense that the humanities and social
sciences are doomed to deliver a seemingly directionless sequence of theories and explanations,
with no promise of additive progress. Second, there is the contrasting record of extraordinary
success in some areas of natural science. Third, there is the explicit articulation of technique and
method in the natural sciences, which fosters the conviction that natural scientists are able to
acquire and combine evidence in particularly rigorous ways. Fourth, there is the perception that
humanists and social scientists are only able to reason cogently when they confine themselves to
conclusions of limited generality: insofar as they aim at significant—general—conclusions, their
methods and their evidence are unrigorous. Finally, there is the commonplace perception that the
humanities and social sciences have been dominated, for long periods of their histories, by
spectacularly false theories, grand doctrines that enjoy enormous popularity until fashion
changes, as their glaring shortcomings are disclosed.

These familiar observations have the unfortunate effect of transforming differences of degree into
differences of kind, as enthusiasts for the alleged superiority of natural science readily succumb to
stereotypes and over-generalizations, without regard for more subtle explanations. Let us
consider the five foundations of this mistake in order.

The most obvious explanation for the difficulties of the Geisteswissenschaften, the humanities and
the study of history and society, is that they deal with highly complex systems. Concrete results
are often achieved in particular instances: historians and anthropologists are able to be precise
and accurate by sacrificing generality, by clear-headedly disavowing the attempt to provide any
grand overarching theory. No large vision of history emerges from our clearer understanding of
the bombing of Dresden, but the details are no less powerful and significant. In this respect,
moreover, matters are no different in the natural sciences. As we shall see, science often forgoes
generality to achieve a precise and accurate answer to an important question.

In English we speak about science in the singular, but both French and German wisely retain the
plural. The enterprises that we lump together are remarkably various in their methods, and also in
the extent of their successes. The achievements of molecular engineering or of measurements
derived from quantum theory do not hold across all of biology, or chemistry, or even physics.
Geophysicists struggle to arrive at precise predictions of the risks of earthquakes in particular
localities and regions. The difficulties of intervention and prediction are even more vivid in the
case of contemporary climate science: although it should be uncontroversial that the Earth’s mean
temperature is increasing, and that the warming trend is caused by human activities, and that a
lower bound for the rise in temperature by 2200 (even if immediate action is taken) is two
degrees Celsius, and that the frequency of extreme weather events will continue to rise,
climatology can still issue no accurate predictions about the full range of effects on the various
regions of the world. Numerous factors influence the interaction of the modifications of climate
with patterns of wind and weather, and this complicates enormously the prediction of which
regions will suffer drought, which agricultural sites will be disrupted, what new patterns of
disease transmission will emerge, and a lot of other potential consequences about which we might
want advance knowledge. (The most successful sciences are those lucky enough to study systems
that are relatively simple and orderly. James Clerk Maxwell rightly commented that Galileo would
not have redirected the physics of motion if he had begun with turbulence rather than with free
fall in a vacuum.)

The emphasis on generality inspires scientific imperialism, conjuring a vision of a completely
unified future science, encapsulated in a “theory of everything.” Organisms are aggregates of
cells, cells are dynamic molecular systems, the molecules are composed of atoms, which in their turn decompose into fermions and bosons (or maybe into quarks or even strings). From these
facts it is tempting to infer that all phenomena—including human actions and interaction—can
“in principle” be understood ultimately in the language of physics, although for the moment we
might settle for biology or neuroscience. This is a great temptation. We should resist it. Even if a
process is constituted by the movements of a large number of constituent parts, this does not
mean that it can be adequately explained by tracing those motions.

A tale from the history of human biology brings out the point. John Arbuthnot, an eighteenth-
century British physician, noted a fact that greatly surprised him. Studying the registry of births
in London between 1629 and 1710, he found that all of the years he reviewed showed a
preponderance of male births: in his terms, each year was a “male year.” If you were a mad
devotee of mechanistic analysis, you might think of explaining this—“in principle”—by tracing the
motions of individual cells, first sperm and eggs, then parts of growing embryos, and showing
how the maleness of each year was produced. But there is a better explanation, one that shows the
record to be no accident. Evolutionary theory predicts that for many, but not all, species, the
equilibrium sex-ratio will be 1:1 at sexual maturity. If it deviates, natural selection will favor the
underrepresented sex: if boys are less common, invest in sons and you are likely to have more
grandchildren. This means that if one sex is more likely to die before reaching reproductive age,
more of that sex will have to be produced to start with. Since human males are the weaker sex—
that is, they are more likely to die between birth and puberty—reproduction is biased in their
favor.
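Arbuthnot's finding can be made vivid with a short simulation. This is my own illustrative sketch, not his data: the yearly birth counts and the male-birth probability of 0.514 (roughly the modern human figure) are assumed values. The point is that once the underlying ratio is even slightly biased toward boys, a run of 82 consecutive "male years" is practically guaranteed, which is why the record is no accident.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

P_MALE = 0.514          # assumed male-birth probability, slightly above 1/2
BIRTHS_PER_YEAR = 10_000  # assumed yearly christening count
YEARS = 82              # Arbuthnot reviewed the years 1629-1710 inclusive

male_years = 0
for _ in range(YEARS):
    # Count male births in one simulated year
    males = sum(random.random() < P_MALE for _ in range(BIRTHS_PER_YEAR))
    if males > BIRTHS_PER_YEAR - males:  # more boys than girls: a "male year"
        male_years += 1

print(f"{male_years} of {YEARS} simulated years were 'male years'")
```

With these parameters the expected excess of boys in a year is about 280 births, while chance fluctuations are on the order of 50, so virtually every simulated year comes out male—just as in the London registry.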

The idea of a “theory of everything” is an absurd fantasy. Successful sciences are collections of
models of different types of phenomena within their domains. The lucky ones can generate
models that meet three desiderata: they are general, they are precise, they are accurate. Lots of
sciences, natural sciences, are not so fortunate. As the ecologist Richard Levins pointed out
decades ago, in many areas of biology—and, he might have added, in parts of physics, chemistry,
and earth and atmospheric science as well—the good news is that you can satisfy any two of these
desiderata, but at the cost of sacrificing the third. Contemporary climatology often settles for
generality and accuracy without precision; ecologists focusing on particular species provide
precise and accurate models that prove hard to generalize; and of course if you abandon accuracy,
precision and generality are no problem at all.

Let us turn now to the celebration of scientific rigor. Individual sciences develop rules and
standards for appraising evidence—as they learn about aspects of nature, they learn more about
how to learn. At any particular stage of inquiry, communities of scientists agree on the canons of
good inference, so that the work of certification of new results goes relatively smoothly. To the extent that the agreed-on rules are reliable, knowledge accumulates. It is important to
understand, however, that at times of major change the standards of good science themselves are
subject to question and discussion. And this observation, amply demonstrated in the history of
the sciences, has important consequences.

The contrast between the methods of the two realms, which seems so damning to the humanities,
is a false one. Not only are the methods deployed within humanistic domains—say, in attributions
of musical scores to particular composers or of pictures to particular artists—as sophisticated and
rigorous as the techniques deployed by paleontologists or biochemists, but in many instances they
are the same. The historical linguists who recognize connections among languages or within a
language at different times, and the religious scholars who point out the affiliations among
different texts, use methods equivalent to those that have been deployed ever since Darwin in the
study of the history of life. Indeed, Darwin’s paleontology borrowed the method from early
nineteenth-century studies of the history of languages.

By the same token, the sense that the humanities are dominated by changes in fashion among
spectacularly false theories suggests a contrast where none is to be found. If Marx and Freud are
favorite whipping boys for those worried about the Geisteswissenschaften, the compliment is
readily returned. Not only did behaviorist psychology—itself motivated by the desire to make
studies of human conduct “truly scientific”—dominate much of twentieth-century social science,
but its influence was foreshadowed within nineteenth-century physics and chemistry with the
proliferation of “ether theories.” No less a figure than Maxwell even characterized the ether as
“the best confirmed entity in natural philosophy” (by which he, like his contemporaries, meant
“natural science”).

So the five points of scientism rest on stereotypes, and these are reinforced by the perception of
threats. As the budgets for humanities departments shrink, humanists see natural scientists
blundering where the truly wise fear to tread. Conversely, scientists whose projects fail to win
public approval seem to envision what John Dupré has called the “Attack of the Fifty-Foot
Humanist,” a fantasy akin to supposing that post-modernist manifestos are routinely distributed
with government briefing books. We need to move beyond the stereotypes and discard the absurd
visions that often maintain them.

To develop a better view, let us focus for the moment on history and anthropology (although the
conclusions I shall defend would also apply to literature and the arts, as well as to critical studies
of them). While it is true that rigorous history and ethnography often give up generality for
accuracy and precision, their conclusions can nonetheless have considerable importance.

Scientific significance is not limited to the discovery of general laws—that idea is a hangover from
an age in which the scientific task was seen as one of fathoming the Creator’s rulebook, of
thinking “God’s thoughts after Him.” The sciences, recall, are collections of models, directed at
answering questions. Not every question matters: there are countless issues about the variation of
your physical environment while you read these sentences that should be of no concern to
anybody. Generality is to be prized, partly because it is often the key to answering questions
wholesale rather than retail, partly because generalizing explanations are often deeper; but there
are many non-general issues, concrete and individual questions, that rightly occupy natural
scientists. Where exactly do the fault lines run in Southern California? What is the relation among
the various hominid species? By the same token, there are many specific questions that occupy
historians and anthropologists.

Some of these questions are causal, about the factors that generated large events or that sustain
particular social systems. Yet there are others that should be emphasized. When Emmanuel Le
Roy Ladurie writes a study of a medieval population on the Franco-Spanish border, or when Jean
L. Briggs reports on family life among the Inuit, these scholars are not primarily interested in
tracing the causes of events. History and ethnography are used instead to show the readers what
it is like to live in a particular way, to provide those of us who belong to very different societies
with a vantage point from which to think about ourselves and our own arrangements. Their
purpose, to borrow an old concept, is a kind of understanding that derives from imaginative
identification.

Although studies such as these make no pretense at generality, their impact can be very large.
They can unsettle the categories that are taken for granted in all kinds of decisions, from
mundane reflections about how to respond to other people to large matters of social policy.
“Collateral damage,” for example, comes to seem an inappropriate way to talk about the victims of
the Dresden fire-bombing. Humanistic studies can also challenge the categories used to frame
lines of scientific inquiry. History and anthropology are sites at which new concepts are forged.
Their deliverances can do what Thomas Kuhn memorably claimed for the study of the history of
the sciences: they can change the images by which we are held. The Bush administration tacitly
concurred with Kuhn’s view when, at a time of shrinking budgets for the arts and humanities, it
launched an initiative to support historical studies of iconic American figures and achievements.
One effect of history (the verdicts of which Bush aimed to counter) may be a rethinking of social
institutions. (I should add that neither in the natural sciences nor in human inquiry should one
conclude that the applications tell the whole story of significance: comprehending something for
its own sake also counts.)

Once the intertwining of human inquiry with social change has been recognized, it is easy to see
why history and ethnography demand constant rewriting. Returning to the same materials is
valuable when historians or anthropologists gain new evidence—like their colleagues in the
natural sciences, they are sometimes lucky in acquiring new data, and thus led to revise. Yet there
are other reasons for revisiting themes and episodes that have already been thoroughly treated.
The history of the Roman Empire needs to be rewritten because the changes in our own society
make new aspects of the past pertinent. Older histories may have played a useful role in
generating styles of social thought that we take for granted, but in the light of our newer
conceptions contemporary historians may view different questions as significant. This may leave
the impression of an enterprise in which nothing ever accumulates, but the impression is
incorrect. If Gibbon has been in many respects superseded, we should nonetheless be grateful for
the impact that his monumental history made on his many readers. Historians return to Gibbon
because his words are not ours—it would be odd to speak as he does of the “licentiousness,”
“prostitutions,” and “chastity” of the empress Theodora. If our questions are different, it is
because we live in a very different culture, one that his history helped to bring about.

The domain of the social sciences is the territory on which humanists and natural scientists
frequently join battle. (It is also contested territory independently of this opposition: economists
and sociologists offer competing accounts of the same phenomena, and of course there are
conflicts within economics itself.) If the intertwining of history and ethnography with social
thought and social change is appreciated, then the vehemence directed against natural scientific
incursions into studies of human psychology and social behavior is more readily comprehensible.
Since the conclusions reached in analyses of human behavior will be socially consequential—they
may result in actual policies—the evidence for them deserves to be closely scrutinized, just as
drugs that may have far-reaching side effects are subjected to the most careful testing.

To declare that there is a “natural unemployment rate” of 6 percent has a wide-ranging social and
political impact, and it is entirely reasonable for critics to examine the evidence alleged to support
such a declaration. Likewise, the outcry against early ventures in sociobiology was fueled by the
perception that, while the claims advanced were sweeping (and sometimes threatening to the
aspirations of large groups of people), the support for them was markedly less strong than that
routinely demanded for theorizing about, say, insect sociality. In recent decades, proponents of
Darwinian approaches to human behavior have been far more methodologically reflective, but the
old war cries continue to echo in the contemporary context.

Human social behavior arises, in a complex social context, from the psychological dispositions of
individuals. Those psychological dispositions are themselves shaped not only by underlying genotypes, but also by the social and cultural environments in which people develop. Cultural
transmission occurs in many animal species, but never to the extent or to the degree to which it is
found in Homo sapiens. Human culture, moreover, is not obviously reducible to a complex system
of processes in which single individuals affect others. Rigorous mathematical studies of gene-culture coevolution reveal that when natural selection combines with cultural transmission, the
outcomes reached may differ from those that would have been produced by natural selection
acting alone, and that the cultural processes involved can be sustained under natural selection.
Whether this happens across a wide variety of domains of human culture or is relatively
rare is something nobody can yet determine. But culture appears to be at some level autonomous
and in some sense irreducible, and this is what scientism cannot grasp.
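The claim that cultural transmission can divert outcomes from those selection alone would produce can be illustrated with a deliberately simple toy model. This is my own sketch with assumed parameter values, not any particular published study: a variant B carries a ten percent fitness cost, so selection by itself eliminates it; but once conformist copying of the majority is added, B can be carried all the way to fixation when it starts off common.

```python
W_A, W_B = 1.0, 0.9   # relative fitnesses: selection acts against variant B
D = 0.5               # assumed strength of conformist bias
GENERATIONS = 200

def step(p, conformity):
    """One generation: viability selection, then optional conformist copying."""
    mean_w = p * W_B + (1 - p) * W_A
    p = p * W_B / mean_w                       # selection pushes B down
    if conformity:
        p = p + D * p * (1 - p) * (2 * p - 1)  # majority variant is over-copied
    return p

p_selection_only = p_with_culture = 0.7  # B starts as the majority variant
for _ in range(GENERATIONS):
    p_selection_only = step(p_selection_only, conformity=False)
    p_with_culture = step(p_with_culture, conformity=True)

print(f"selection alone:       B frequency = {p_selection_only:.4f}")
print(f"with cultural copying: B frequency = {p_with_culture:.4f}")
```

Selection acting alone drives the costly variant essentially to extinction, while the combined dynamics fix it—a minimal instance of the general point that gene-culture models can reach outcomes natural selection alone would never produce.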

Let us imagine a model—call it the Big Model—of the causes of human social behavior. Natural
selection would be a part of the story, but only a part. Work showing that particular social
practices enhance reproductive success makes a welcome contribution, but more is needed if one
is to draw conclusions about human practices as Darwinian adaptations. For if natural selection is
at work, there must be a genetic basis for the psychological capacities and dispositions that
underlie those practices. Moreover, the dispositions and the capacities are generated from the
genetic basis in a particular environment or a range of environments, and it is important to
understand how, in the pertinent environments, those genes give rise to other traits that might
have quite different effects on reproductive success. And if genes, environments, and
psychological characteristics are thoroughly entangled, it will be wrong to focus on forms of
behavior that simply strike the investigator as potential adaptations. Moreover, the case analyzed
must be one of those in which cultural transmission does not divert the outcome from that
predicted by natural selection. (Even this is, strictly speaking, not enough, because it leaves
unquestioned the assumption that the contribution of culture can be reduced to episodes in which
items are transmitted from one individual to another.) All these gaps are difficult to fill in
practice, because—as Darwin often lamented—“profound is our ignorance.”

Beautiful and rigorous work in human behavioral ecology has been done, specifically in the
construction of precise models and the measurement of contributions to reproductive success.
But there is a tendency to inflate its conclusions, to think that the Darwinian part of the story is
now in place and “cultural influences” can be left to mop up the rest. To draw such a conclusion is
to overlook the fact that claims about adaptation presuppose a genetic basis for the relevant forms
of behavior, and to assume that the case in question is one of those in which cultural transmission
makes no difference. In effect, human behavioral ecologists simplify the Big Model in certain
ways, and there is no advance reason to hold that what they have retained is more significant than
what they have left out. Scientific analysis in this area would do better to proceed more symmetrically, to consider a variety of ways of simplifying an intractable Big Model, and to
postpone firm conclusions about the operation of natural selection until a diversity of approaches
has been explored. Welcome the precise studies, but propose conclusions tentatively. The political
ramifications of conclusions about human beings only reinforce the demand for modesty.

For a variety of reasons, then, human inquiry needs a synthesis, in which history and
anthropology and literature and art play their parts. But there is still a deeper reason for the
enduring importance of the humanities. Many scientists and commentators on science have been
led to view the sciences as a value-free zone, and it is easy to understand why. When the
researcher enters the lab, many features of the social world seem to have been left behind. The
day’s work goes on without the need for confronting large questions about how human lives can
or should go. Research is insulated because the lab is a purpose-built place, within which the
rules of operation are relatively clear and well-known. Yet on a broader view, which explores the
purposes and their origins, it becomes clear that judgments of the significance of particular
questions profoundly affect the work done and the environments in which it is done. Behind the
complex and often strikingly successful practices of contemporary science stands a history of
selecting specific aspects of the world for investigation. Bits of nature do not shout out “Examine
me!” Throughout history, instead, innovative scientists have built a number of lampposts under
which their successors can look. It is always worth considering whether the questions that now
seem most significant demand looking elsewhere for new sources of illumination.

We are finite beings, and so our investigations have to be selective, and the broadest frameworks
of today’s science reflect the selections of the past. What we discover depends on the questions
taken to be significant, and the selection of those questions, as well as the decision of which
factors to set aside in seeking answers to them, presupposes judgments about what is valuable.
Those are not only, or mainly, scientific judgments. In their turn, new discoveries modify the
landscape in which further investigations will take place, and because what we learn affects how
evidence is assessed, discovery shapes the evolution of our standards of evidence. Judgments of
value thus pervade the environment in which scientific work is done. If they are made, as they
should be, in light of the broadest and deepest reflections on human life and its possibilities, then
good science depends on contributions from the humanities and the arts. Perhaps there is even a
place for philosophy.

Healthy relationships between the sciences and the humanities should aspire to the condition of
the best marriages—to a partnership in which different strengths and styles are acknowledged
and appreciated, in which a fruitful division of labor constantly evolves, in which constructive criticism is given and received, in which neither party can ever make a plausible claim to absolute
authority, and in which the ultimate goal is nothing less than the furtherance of the human good.

Philip Kitcher is John Dewey Professor of Philosophy at Columbia University and the author, most
recently, of The Ethical Project (Harvard University Press). This article appeared in the May 24,
2012 issue of the magazine.