Is the world becoming sicker, or are we just better able to detect disease? The last decades have seen dramatic improvements in biological disease detection, with dozens of new potential pathogens anticipated by 2020. At the same time, innovations in information management are increasing awareness of disease outbreaks. Perry et al. (2011)1 explore this in a recent review and conclude that there is overall evidence for increased emergence of disease in recent decades, and not just improvements in diagnosis and surveillance. The current increase in disease emergence is not historically unprecedented: major epidemiological transitions also occurred during the Neolithic, when livestock were domesticated on a wide scale; during the age of exploration, when Old World pathogens were introduced to the New World; and, to a lesser extent, with increased global travel in the nineteenth century.

Perry, B.D., Grace, D., Sones, K., 2011. Current drivers and future directions of global livestock disease dynamics. Proceedings of the National Academy of Sciences of the United States of America, 16 May 2011. doi:10.1073/pnas.1012953108 [↩]

The horse plague — sketches about town during the epidemic, by Theodore R. Davis. From Harper’s Weekly: A Journal of Civilization, 1872.

On the evening of October 21st only a few animals were affected, but on the morning of the 22d there was scarcely an animal of the equine species that was not affected. Horses, mules, and even a zebra. More than twenty thousand were suffering in different degrees.1

“A terrified man realizing he has just contracted the plague, surrounded by a group of people.”
By E.M. Ward, 1848.

Even the most lethal pathogens we know of don’t kill every single infected individual.1 Sometimes this is because the pathogen that infects the person is relatively weak. Sometimes it’s because the dose was low. And sometimes it’s because of something intrinsic to the patient: some people are genetically resistant to HIV, for example, because they have a mutated receptor.

The opposite is also true. Sometimes people are intrinsically more susceptible to a pathogen. That became terribly clear during the AIDS epidemic, when quite innocuous agents started killing people, but there are probably many, many natural genetic variants that make us susceptible to some pathogens, just as some make us resistant. When epidemiologists look for “risk factors” that increase mortality or disease severity, this is part of the information they’re trying to tease out, in a rather crude way. Sorting this out is part of the goal of the whole personalized medicine movement.

A fascinating example was just documented in MMWR.2 Here a researcher was working with a genetically modified form of the Black Plague bacterium (Yersinia pestis). This strain should have been harmless, because its ability to grab iron from the host had been removed.3 But the researcher became infected with the weakened strain, and died.

We now learn that this was probably because the researcher had his own genetic mutation, hereditary hemochromatosis, which leads to increased levels of iron in the blood. He may4 have been uniquely susceptible to this strain,5 which could only infect people who conveniently made extra iron available to it:

Conceivably, hemochromatosis-induced iron overload might have a similar effect, enhancing the virulence of the infecting KIM D27 strain by compensating for its iron-acquisition defects6

Patients and pathogens are ecosystems; you need to understand both of them, or you don’t understand either.

Even rabies virus, for example, which kills well over 99.999% of the people it infects, has had a half-dozen survivors. Myxomatosis virus let a few rabbits survive, and their progeny became relatively resistant; there are a handful of long-term survivors of HIV infection; and when we get down to things like Ebola and smallpox, 10-30% of infected people survive.[↩]

The quest for iron is a constant struggle for pathogenic (and other) bacteria, and they have evolved all kinds of mechanisms to seize it from the host, while at the same time animals have evolved more and more ways to keep iron away from invading bacteria.[↩]

Then, it would seem to have been all but unanimous; and now, one would think, at first sight, that it were almost an insult to human understanding to be obliged to collect statistics to prove that vaccination confers a large exemption from attacks of small pox, and almost absolute security against death from that disease. … The general ignorance of the community, especially of the lower orders, as to the aim and object of vaccination, is lamentably great, and has still to be overcome.

As I’ve noted several times before, regulatory T cells (TRegs) are an important reason for the poor immune response to tumors. TRegs are normal components of an immune response, “designed” to keep inflammation from running riot in general, and to prevent responses to self-antigens in particular. Whether it’s because tumors are mostly (though not solely) self-antigens, because tumors are chronic sources of stimulation that could lead to inflammation running riot, or because tumors “learn” how to specifically trigger TReg-like responses, TRegs are common features of tumors.

Eliminating TRegs, in mouse models of cancer, often allows a strong immune response to the tumor. An interesting spin on this was shown in a recent J Immunol paper.1 It seems that TRegs don’t suppress all responses equally; they shut down the responses to some targets harder than others:

Our results indicate, therefore, that depletion of Tregs uncovers cryptic responses to Ags that are shared among different tumor cell lines. CT26-specific T cell responses can be elicited by different forms of vaccination in the presence of regulatory cells, but in these cases T cell responses are highly focused on a single tumor-specific epitope… Taken together, these data suggest that immune responses to some Ags are more tightly regulated than others.1

In other words, even though you might be able to force a protective immune response to a tumor by vaccinating in the presence of TRegs, when you get rid of TRegs the response is broader, and targets T cell epitopes that otherwise wouldn’t look like they’re epitopes at all.

I wonder if this goes on with “normal” (say, viral or other non-tumor) epitopes – whether this sort of thing might help explain some forms of immunodominance. I kind of doubt it, but the phenomenon does sound a little like revealing a subdominant response.

I wonder also how this ties in with a recent paper that suggested TRegs in tumors are highly focused on a small subset of tumor epitopes. Could they be more broadly-based, but on epitopes that are otherwise invisible? Again, I kind of doubt it, but it’s an intriguing idea. Maybe the universe of tumor epitopes available for attack is much larger than we realize.

(We are in the process of selling one home and buying another, while at work I just finished organizing a course on biosecurity for an international group. In the upcoming week I’m traveling to a conference in Washington. To say nothing of the Thanksgiving holiday. All this means short and scarce updates for a little while.)

We know that the immune response to HIV forces the virus to evolve at great speeds, so that the viral targets of the immune response change and become at least temporarily invisible. We also know that the specific targets are different for almost every infected person. So although you have rapid evolution in each individual, what does this mean to overall evolution of the global population of HIV?

The several dozen CTL epitopes we survey from HIV-1 gag, RT and nef reveal a relatively sedate rate of evolution with average rates of escape measured in years and reversion in decades. For many epitopes in HIV, occasional rapid within-host evolution is not reflected in fast evolution at the population level.1

(My emphasis) This is a modeling study (though it did look at real-life data to some extent), but their conclusion is consistent with larger-scale population studies as well; see my previous post here and links therein.

Rinderpest is the most fatal disease affecting cattle. … The first great epizootic of which there seems to be records occurred about 1709 and spread over nearly all of the countries of Europe. It is reported that 1,500,000 cattle died from its effects during the years from 1711 to 1714. 2

1839 (via 1861):

Previously to the present century the only well recognized epizootics that are known to have prevailed extensively among horned cattle in Europe were the Eczema Epizootica, or “mouth and foot disease”, a complaint well known in England since the year 1839, and the terrible Rinderpest or Steppe murrain.

This last named disease, which is described as being of the nature of a highly infectious typhus fever, terminating in dysentery, is said to be indigenous to the Steppes of Tartary and Siberia, from whence it has descended, from time to time, upon Russia, Germany, and other European countries.

It has been estimated that during the eighteenth century the Rinderpest destroyed, in Europe, as many as two hundred millions of cattle.3

1865 (via 1880):

In 1865 the plague appeared in Holland, and was carried thence to England. In both countries the disease carried off one hundred thousand head of cattle in the course of a few months.4

1889 (via 1909):

About the year 1889, or shortly before, a virulent form of rinderpest started among the domestic cattle and wild buffalo almost at the northern border of the buffalo’s range, and within the next few years worked gradually southward to beyond the Zambesi. It wrought dreadful havoc among the cattle and in consequence decimated by starvation many of the cattle-owning tribes; it killed many of the large bovine antelopes, and it wellnigh exterminated the buffalo.5

2010:

14 October 2010, Rome – An ambitious global effort that has brought rinderpest, a deadly cattle plague, to the brink of extinction is ending all field activities, paving the way for official eradication of the disease.6