Book review: An Epidemic of Absence takes on the worms you're missing.


An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases could be co-marketed with Thomas Rockwell’s children’s classic How to Eat Fried Worms. It begins with the author, Moises Velasquez-Manoff, recounting his border-crossing to Tijuana to infect himself with Necator americanus—hookworms—in an attempt to cure the asthma, hay fever, food allergies, and alopecia that had plagued him since childhood. In the next three hundred pages, the author very cogently explains the idea that led him to willingly infect himself with a parasite known to cause severe diarrhea, anemia, and cognitive impairment in children.

Velasquez-Manoff marshals the reams of evidence researchers have accumulated to support said concept: the hygiene hypothesis, but with an updated, parasitic twist. The ideas he presents haven't been accepted by many in the medical community, and there's little high-quality evidence, in the form of well controlled trials, that exposure to parasites could have positive effects on human health. So, even if the author is thorough, it's important to keep in mind that the evidence he's presenting is primarily in the form of correlations.

Vanishing microbes, a rising tide of allergies

Since children's author Rockwell penned that first worm-eating guide in 1973, the incidence of allergies and autoimmune diseases—both disorders in which the immune system attacks things that it should know are harmless—has skyrocketed in the developed world. Those who read that book as children have watched it happen; whereas a PB&J was the standard brown bag lunch for us, our children’s schools are now all nut-free.

Adherents of the paleo diet maintain that the trouble started with the advent of agriculture in the Neolithic Revolution, about 12,000 years ago, and has been getting progressively worse. They argue that humans have adapted to eat only foods that could be hunted or gathered, and the recent preponderance of allergies demonstrates that we have not yet evolved to eat food that must be cultivated, like wheat and legumes.

Velasquez-Manoff cites examples of contemporary hunter-gatherer societies whose members are, in fact, much healthier than their age-matched counterparts in the developed world. Yet evolutionary geneticists have an alternate explanation for the dramatic upswing in immune disorders. Genes that cause disease but are common throughout a population, the thinking goes, must confer some benefit or they would have been selected against. And the genetic variants that predispose modern humans to immune disorders are distressingly common. They are also found in genes that are present across a wide variety of species, indicating that they are quite ancient—further evidence that they probably have an important function. Research on different immune diseases all over the world suggests that these genes are involved in defense against pathogens.

For almost all of our evolutionary history, humans were pretty much covered in bacteria, viruses, and parasitic worms. Velasquez-Manoff refers to this group as "old friends." These are different from the disease-causing bugs we vaccinate against; we acquired those much later, from the animals we domesticated, and Velasquez-Manoff is clear in his insistence that vaccinating against them is necessary and good.

But finding a way to achieve some sort of truce with these "old friends" was the immune system’s essential and constant job throughout our co-evolution. It is only since the sanitary reforms of the mid-nineteenth century that we have suddenly found ourselves in an environment relatively purged of microbes, and these protective genes may be a liability in that environment, rather than an asset.

The Hygiene Hypothesis

A simplistic view of the hygiene hypothesis is that in the absence of something dangerous to fight against—the cholera toxin, for example—immune cells get confused, or bored, and fight against harmless stimuli like dust mites and peanuts instead. But there is a more nuanced view. Our immune systems co-evolved with an enormous community of microbes, and were in fact shaped by them. Many became established, long-term, and vital residents in our guts; the importance, and in fact the very existence, of these commensals has only recently been realized.

Constant exposure to all of these bugs, as a unit, enhanced the regulatory arm of the immune system, modulating responses so that we could tolerate the filthy environment in which we lived while at the same time (hopefully) fighting off the pathogens that posed a mortal threat, without destroying our own bodies in the process. In the martial analogy that is inevitable in discussing immunology, the ancient human immune cells that were always surrounded by microbes were like battle-hardened old soldiers who have learned to watch warily when encountering something new, waiting to see whether or not it is dangerous; modern immune cells raised in our hyper-sanitized environment are like raw recruits just handed their first gun: testy, jumpy at the first hint of a threat, and liable to blow up their surroundings with inappropriately directed and outsized force. Experience has not taught them moderation.

On the molecular level, immune cells in the dirty old days made more anti-inflammatory signaling molecules; now, our cells make predominantly pro-inflammatory signals.

Autoimmune diseases are currently thought to arise from an interplay of genetic and environmental factors, notably stress. Some have argued that this means genes are everything, because what modern human doesn’t have stress in his life? Only those genetically primed go on to develop disease.

But Velasquez-Manoff takes us to Sardinia to upend this argument. Sardinians are an isolated, inbred group, and they have experienced a twin epidemic of multiple sclerosis and type 1 diabetes, both autoimmune diseases, in the past sixty years—ever since they got rid of malaria. For the past few thousand years, those Sardinians who were genetically resistant to the malaria parasite survived; those who were not did not. The relentless presence of malaria in their environment shaped their genomes.

And then when malaria was suddenly removed, its lack may have allowed the immune system’s underlying protective feature to go into overdrive. A similar, if less dramatic, trajectory of events could explain how the removal of most of our "old friends," but especially the worms, uncovered underlying genetic tendencies that only yield autoimmune and allergic disorders in our modern context.

The Helicobacter pylori story further underscores the importance of context. H. pylori definitely, without a doubt, causes ulcers. And stomach cancer. Yet it protects against heartburn, esophageal cancer, asthma, and eczema.

H. pylori has been with humanity since before we left Africa. Why would it make only some people who harbor it sick, and why so much more so in the last few hundred years? Our increased life expectancy doesn’t account for it. In days of yore, when humans routinely encountered H. pylori early in childhood, the bug taught their immune cells tolerance and protected against asthma. Now, since we grow up in cleaner environments, we encounter it later. Not only does its early absence predispose to asthma, its late introduction induces ulcers. But the effects of H. pylori infection are dictated not only by when it is introduced to the human gut, but by the other microbes it does or does not encounter there.

Parasitic worms also seem to be significant regulators of the immune system, able to elicit just the right balance of ferocity and temperance. Deworming campaigns the world over are promptly and predictably followed by increases in asthma and allergy, and the degree of allergy in a society is inversely proportional to how wormy and dirty it is. Hence, people suffering from allergies and autoimmune diseases are now infecting themselves with hookworms, which—on an anecdotal level at least—has alleviated maladies ranging from MS to autism to celiac disease.

Seeing worms everywhere

Yes, he includes autism in the list of modern diseases caused by our out-of-whack immune systems. Along with other cases where immune dysfunction hasn't been established, like obesity, cardiovascular disease, type 2 diabetes, and cancer.

There are some serious problems with blaming all of these on immune dysfunction, but we'll focus on a single example: autism. Just as the absence of worms’ mediating effects on our immune system causes some people to have an allergic response to harmless ingested proteins and others to attack their own tissues, the argument goes, chronic inflammation in the womb predisposes the developing fetus to autism.

Velasquez-Manoff cites circumstantial evidence supporting this idea—autism follows the same demographics as asthma, occurring primarily in firstborns, males, and urban centers in wealthy countries, and one of the risk factors for autism is a mother with an autoimmune disease. But this demographic data is obtained by questionnaire, often with quite a small sample size, and is thus inherently suspect.

As researchers continue to delve into the cause of autism, the data they are accumulating indicates that it is a genetic and not an immune disease. Environmental factors are almost certainly involved, and it is a complex genetic disease—mutations in many different genes, possibly hundreds of them, can cause it. Most cases of autism are sporadic, meaning that only one individual in a family is affected.

But rigorous experiments have shown that even sporadic cases can generally be traced back to spontaneous genetic mutations in the developing fetus. The mutated genes are involved in forming and maintaining the gross architecture of the brain, lending credence to the idea that autism and its accompanying spectrum of disorders arises from a lack of connectivity among neurons. Few of the mutated genes associated with autism seem to be involved in immunity.

When we discussed his book, Velasquez-Manoff did suggest there were limits to how well we should treat our old friends. Deworming campaigns, he said, are still necessary, since the world's poor children are the ones who bear the brunt of the worms' negative effects. Their parasites exacerbate their malnutrition and cause them to miss a lot of school, promoting a cycle of poverty.

The sanitary reforms of the mid-nineteenth century, along with the germ theory of disease and the vaccines and antibiotics it precipitated, were undoubtedly an enormous medical coup that largely eliminated the infectious diseases that had formerly killed a quarter of the population by age one. We no longer live in fear of the Black Death or similar medieval scourges that killed millions.

But along with these breakthroughs came the idea that all microbes are bad, which yielded needless antibacterial soap and sanitary covers for toddlers sitting in shopping carts. The backlash, that we need to get to know and love the microbiota inhabiting our guts, is yielding Brooklyn hipsters who brew their own kombucha and the very unfortunate anti-vaccine movement. This book argues that microbes are neither good nor bad, but can be either or both depending on the context in which we encounter them. And the real cause of the allergy and autoimmune epidemic is that we have severely screwed up that context, both inside our guts and outside in the rest of the environment.

These ideas are still well outside the medical mainstream and, in several cases, the author's intense focus on immune disorders has led him to get carried away. But Velasquez-Manoff has put together a well-argued case that, for at least some disorders, our interactions with microbes should be given serious consideration. And, despite the copious research involved, the book remains very readable.

And since you were wondering: the worms seemed to clear up his sinuses and skin, but he didn’t grow hair. And they gave him terrible, terrible diarrhea.