Infectious Laughter

David Serlin

In the days before public health campaigns had all but eradicated maladies like smallpox
and polio, the singular distinction of being the rarest disease on the planet belonged to kuru.
Kuru first came to international attention in the late 1950s after D. Carleton Gajdusek, an
American virologist and pediatrician, was invited to investigate a mysterious and fatal illness
that was devastating the Fore tribe in New Guinea. Between 1957 and 1968, over 1,100 of the Fore
died from kuru, the vast majority of whom were adult females and children.1 Kuru marks
its epidemiological territory through what observers called the “laughing sickness” or “laughing
death,” a distinct set of physical effects that include hysterical laughter, dementia, bodily
spasms, and a broad, terrifying smile across the face of its hapless victim.

Kuru is usually
identified as a prototypical example of a culture-specific disease, that is, one that only emerges
in the context of a particular society and is found nowhere else on earth. After studying the
Fore’s formal customs and daily activities, Gajdusek postulated a link between those who had
contracted kuru and their participation in funeral rites, which for the Fore people included
mortuary cannibalism and the eating of human brains. Challenging the idea that kuru was hereditary,
Gajdusek argued that it was the consumption of diseased brains that was responsible for delivering
kuru’s fatal blow. Anthropologists later found that women and children had been the most likely to
contract kuru because feasting on human brains was among the few methods by which non-male members
of the gender-stratified tribal hierarchy received any protein. Gajdusek convinced the Fore elders
to discontinue the tribe’s cannibal practices, and as a result the number of kuru deaths began to
decline. By the 1970s, fatalities from kuru had declined precipitously and the symptoms of the
“laughing death” seemed to have all but disappeared.

The discovery of kuru amidst post-World
War II campaigns focused on global standards of living, like those of the Marshall Plan or the
World Health Organization, must have played like something from a horror film that revealed an
uncharted world impervious to the golden touch of modernity. For Westerners, the practice, let
alone the existence, of cannibalism is unsettling, though it also entices us to reflect upon our
insatiable curiosity about the rituals of so-called primitive peoples. Eating animal organs known
as “sweetbreads,” such as the pancreas and thymus of a calf or lamb, holds high status in some
gastronomic circles. But eating human brains—or any part of any human, for that matter—has never
fully been engaged by the
aesthetics of taste, even among gourmets, and continues to provoke
visceral repulsion and disgust. Even a B-grade horror film like Attack of the Crab
Monsters (1957), typically interpreted as a generic Cold War allegory about communist
invasion, capitalized on the public’s fascination with the arrival of cannibalistic marauders who
served as potent metaphors for the invasion of the individual body and the body politic. In the
film, aliens materialize as voracious crabs that eat human brains and absorb the voices and
memories of their victims. A decade later, George A. Romero’s cult classic Night of the Living
Dead (1968) allegorized the specter of kuru more closely by depicting a band of virtually
unstoppable zombies who banquet upon human brains and other body parts. Romero’s film, and the wave
of 1970s horror films that followed it, reveled in the gory excrescences of blood feasts and serial
killings, luring audiences to imagine that human cannibals were a far more likely phenomenon than
brain-eating alien invaders. Staking their claim at the intersection of psychologically driven
horror and Grand
Guignol-inspired humor, such films suggest that our revulsion toward and
fascination with eating brains not only excavates a deep-seated social taboo but, like all taboos,
perpetually gives license to the mind to wander into uncharted territory.

As a kid growing
up in the 1970s, I was both repelled and consumed by the existence of kuru, as well as by the
possibility that it might turn me, my family, and my closest friends into hysterical cannibals. The
rarity of the disease only served to convince me of the inescapable likelihood of its
transmissibility. I remember reading about kuru in a copy of our family encyclopedia and staring
for hours at the grainy black-and-white image of a laughing Fore tribesman. The colonial dimensions
of the photograph notwithstanding, the image of a laughing, brain-eating cannibal seemed to me
incompatible with the gravity of the disease, since laughter itself seemed incompatible with
cannibalism. For me, the true horror implicit in the photograph derived not simply from the
prospect that this human actually ate human brains—a seemingly inhuman act—but that such savagery
resulted in peals of echoing, maniacal laughter. Here was laughter that was neither silly nor
charming nor charismatic—what my young mind understood to be the link between the sensation of
laughing and the experience of happiness. Instead, here was a visual depiction of a sensation I
feared even as I tried desperately to imagine it.

La Vache Qui Rit, launched in 1921, was the world’s first branded cheese.

Many years later, I learned that the kuru
victim depicted in that photograph was not in fact laughing and, indeed, the idea that kuru
produces “laughing sickness” was something of an oversimplification focused on the most telltale
effect of the disease. Gajdusek’s findings on kuru in the 1960s, for which he won the Nobel Prize
in 1976, laid the groundwork for Stanley B. Prusiner, a neurologist who discovered the
existence of prions, misfolded proteins capable of passing disease from the brain of one organism to
that of another. For Prusiner, who won the Nobel Prize in 1997, prions were instrumental in
understanding the molecular basis of Creutzfeldt-Jakob disease (CJD), the human
counterpart to bovine spongiform encephalopathy (BSE), known colloquially as “mad cow”
disease. A common element of diseases like BSE, CJD, and
kuru is that they all deprive their victims of muscle control and induce what might be perceived as
manic behavior—hence the so-called madness at the core of “mad cow.” During the initial period of
the disease, the body submits to uncontrollable physical spasms and audible outbursts; after a
short time, it succumbs to complete passivity until all physical movement comes to a standstill.

What I believed the image of a Fore tribesman told me about the maniacal laughter of
flesh-eating cannibals was so utterly different from what the image actually depicted that I have
come to regard kuru as something of a modern object lesson. The allegedly hysterical laughter
attributed to kuru is not a subjective, personality-driven reaction to the joys of cannibalism, as
I had imagined it. It is an involuntary neurodegenerative reaction resulting from the collapse of
all physical constraints. Laughter, the essential charm of which derives from its evanescence,
becomes with the onset of kuru an eerily empty signifier unmoored from any recognizable system of
meaning except, perhaps, for signaling one’s imminent mortality. Similarly, the broad smile that
appears on the victim’s face is not a deliberate expression of pleasure or contentment. It is,
instead, the smile’s terrifying opposite: the confirmation of the complete collapse of one’s
neurophysiologic system. The smile brought on by kuru is the inscrutable grimace of a face
seemingly stripped of legibility.

According to recent health statistics, smallpox has
replaced kuru as the rarest disease on the planet. Since the mid-1960s, however, a number of
neurodegenerative diseases related to kuru have reportedly been on the rise: not among
brain-feasting cannibals but among members of the civilized West. During the period in which kuru
was virtually eradicated from the Fore tribe, for example, many women and children—the same
demographic groups that contracted kuru—in the US, Europe, and parts of Asia and
Africa contracted CJD as a result of injections of growth hormones and fertility drugs
distilled from pituitary glands harvested from infected human cadavers.2
Before the mad cow scares of the mid-1990s, cattle and sheep were regularly pulverized and used as
food sources for other cattle and sheep, which made it possible for the disease to pass to healthy
animals and humans through the food chain. According to one apocalyptic researcher, infected meat
will leave half the British population brain-dead by CJD by the middle of the
century.3 Meanwhile, cattle farmers from Britain and the European Union have been
exporting beef infected with BSE into the free market even while banning the
consumption of tainted beef in their own countries, contributing to the appearance of
CJD in India and parts of Asia. Clearly, the exoticism that we once attributed to the
cannibalistic rituals of primitive tribes has come home to us through the widespread practices of
industrial farming and modern pharmaceuticals, thereby recalibrating the cycle of taboo according
to the politics of feast or famine. With the specter of “mad cow” on the horizon, it has become
virtually impossible for me to eat those triangle-shaped wedges of soft processed cheese imported
from France known as La vache qui rit [“the laughing cow”] with anything approaching the
ironic gusto with which I enjoyed it only a few years ago. In the era of BSE, the
image of a cow laughing, no matter how stylized or nostalgic, reminds me too much of the
provocative misalignment between signifier and signified.

When is laughter a sign of true
happiness, and when is it a sign of sickness? When is laughter something else altogether? Just as
kuru is described as an essentially culture-specific disease, our understanding of smiling and
laughter is similarly specific, shaped as much by culture and geography as by the relativistic
assumptions that societies attribute to them. During his time researching kuru in New Guinea in the
1950s and 1960s, D. Carleton Gajdusek posed for photographs among grateful members of the Fore
tribe. In image after image, Gajdusek is surrounded by smiling and laughing adolescent boys, most
of whom his research had saved from an almost certain death. But like the encyclopedia’s image of
the hysterical cannibal, these photographs of Gajdusek, read in retrospect, assume a sinister
dimension that challenges our expectations of what laughter is supposed to signify. In the
mid-1990s, at precisely the same time that “mad cow” disease was making international headlines,
Gajdusek pleaded guilty to charges that he had engaged in sexual relations with several of the Fore
boys whom he had formally adopted and to whom he had given his own surname. In 1998, after a brief
period of incarceration at a Frederick, Maryland, penitentiary, a federal judge offered him the
opportunity to leave the United States, never to return. Gajdusek chose to settle in Paris, a
safe haven for serious gourmets and social outcasts alike, and home to the corporate headquarters
of Bel, the manufacturer of the cheese marked by the sign of the laughing cow. If it is possible
to accept that laughter is a medium capable of both illumination and illusion, of revealing inner
joy and signaling hidden suffering, how will we ever be able to tell the
difference?

David Serlin is an assistant professor of communication and science studies
at the University of California, San Diego, and an editor-at-large for Cabinet.
He is the author of Replaceable You: Engineering the Body in Postwar America
(University of Chicago Press, 2004).