Malaria -- Summary
In a research report in Science, Rogers and Randolph (2000) note what is probably well known to all, namely, that "predictions of global climate change have stimulated forecasts that vector-borne diseases will spread into regions that are at present too cool for their persistence." Indeed, such predictions constitute one of the major global-warming scare-stories of the world's climate alarmists. There are, however, several problems with this scenario.

According to Reiter (2000), claims that malaria resurgence is the product of CO2-induced global warming ignore other important factors and disregard known facts. An historical analysis of malaria trends, for example, reveals that this disease was an important cause of illness and death in England during a period of colder-than-present temperatures throughout the Little Ice Age. What is more, its transmission began to decline only in the 19th century, during a warming phase, when, according to Reiter, "temperatures were already much higher than in the Little Ice Age."

We could well ask ourselves, therefore, why malaria was so prevalent in Europe during some of the coldest centuries of the past millennium ... and why we have only recently witnessed malaria's widespread decline at a time when temperatures have been warming. Clearly, there must be other factors at work that are more important than temperature. And there are -- factors such as the quality of public health services, irrigation and agricultural activities, land use practices, civil strife, natural disasters, ecological change, population change, use of insecticides, and the movement of people (Reiter, 2000; Reiter, 2001; Hay et al., 2002).

Why, then, do climate alarmists predict widespread future increases in malaria? They do so because nearly all of the analyses they cite use only one, or at most two, climate variables to predict the future distribution of the disease over the earth; and they generally include none of the non-climatic factors listed in the paragraph above. In one recent modeling study, for example, Rogers and Randolph (2000) employed a total of five climate variables and obtained very different results. Briefly, they used the present-day distribution of malaria to determine the specific climatic constraints that best define that distribution; the multivariate relationship derived from this exercise was then applied to future climate scenarios produced by state-of-the-art climate models, in order to map potential future geographical distributions of the disease.
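The difference between univariate and multivariate climatic constraints can be illustrated with a minimal Python sketch. The data and the simple "climate envelope" rule below are hypothetical, invented purely for illustration; Rogers and Randolph's actual model was a statistical fit to real malaria distribution maps, not an envelope of this kind:

```python
# Sketch: univariate vs multivariate climate "envelope" constraints.
# All numbers are hypothetical, invented purely for illustration.

# Each record: (mean_temp_C, annual_rain_mm) at sites where malaria is present today.
presence = [(26, 1400), (28, 1600), (25, 1100), (27, 1300), (29, 1500)]

def envelope(points):
    """Per-variable (min, max) bounds observed at the presence sites."""
    return [(min(v), max(v)) for v in zip(*points)]

def suitable(site, bounds):
    """A site is suitable only if EVERY variable falls inside its bounds."""
    return all(lo <= x <= hi for x, (lo, hi) in zip(site, bounds))

bounds = envelope(presence)          # [(25, 29), (1100, 1600)]

# A future-climate site that has warmed into the malaria temperature range
# but is far too dry for transmission:
warm_dry_site = (27, 400)

temp_only = bounds[:1]               # univariate constraint (temperature alone)
print(suitable(warm_dry_site[:1], temp_only))  # True  -> univariate model predicts spread
print(suitable(warm_dry_site, bounds))         # False -> multivariate model does not
```

The point of the sketch is simply that a temperature-only rule admits every site that warms past the threshold, while adding even one more constraint (here, rainfall) can exclude it.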

The results of their study revealed very little change: a 0.84% increase in potential malaria exposure under the "medium-high" scenario of global warming and a 0.92% decrease under the "high" scenario. In consequence of these findings, Rogers and Randolph explicitly state that their quantitative model "contradicts prevailing forecasts of global malaria expansion," and that "it highlights the use [we would say superiority] of multivariate rather than univariate constraints in such applications."

Clearly, this study undercuts the climate-alarmist claim that any future warming of the globe will allow malaria to spread into current malaria-free regions, as do the studies of Hay et al. (2002) and Shanks et al. (2000). The first of these research groups investigated long-term trends in meteorological data at four East African highland sites that experienced significant increases in malaria cases over the past couple of decades, reporting that "temperature, rainfall, vapour pressure and the number of months suitable for P. falciparum transmission have not changed significantly during the past century or during the period of reported malaria resurgence." Hence, these factors could not be responsible for the observed increases in malaria cases. Likewise, Shanks et al. examined trends in temperature, precipitation and malaria rates in western Kenya over the period 1965-1997, finding absolutely no linkages among the variables.

Also working in Africa, Small et al. (2003) examined trends in a climate-driven model of malaria transmission between 1911 and 1995, using a spatially and temporally extensive gridded climate data-set to identify locations where the malaria transmission climate suitability index had changed significantly over this time interval. Then, after determining areas of change, they more closely examined the underlying climate forcing of malaria transmission suitability for those localities. This protocol revealed that malaria transmission suitability did indeed increase because of climate change in specific locations of limited extent; but in Southern Mozambique, which was the only region for which climatic suitability consistently increased, the cause of the increase was increased precipitation, not temperature. In fact, Small et al. say that "climate warming, expressed as a systematic temperature increase over the 85-year period, does not appear to be responsible for an increase in malaria suitability over any [our italics] region in Africa." Hence, they concluded that "research on the links between climate change and the recent resurgence of malaria across Africa would be best served through refinements in maps and models of precipitation patterns and through closer examination of the role of nonclimatic influences," the great significance of which has recently been demonstrated by Reiter et al. (2003) for dengue fever, another important mosquito-borne disease.

Further examining the reemergence of malaria in the East African highlands were Zhou et al. (2004), who addressed the issue via a nonlinear mixed-regression model study that focused on the numbers of monthly malaria outpatients of the past 10-20 years at seven East African highland sites and their relationships to the numbers of malaria outpatients during the previous time period, seasonality and climate variability. In doing so, they say that "for all seven study sites, we found highly significant nonlinear, synergistic effects of the interaction between rainfall and temperature on malaria incidence, indicating that the use of either temperature or rainfall alone is not sensitive enough for the detection of anomalies that are associated with malaria epidemics [our italics]," as has also been found by Githeko and Ndegwa (2001), Shanks et al. (2002) and Hay et al. (2002). What is more, climate variability -- not temperature or warming alone -- contributed less than 20% of the temporal variance in the number of malaria outpatients, and did so at only two of the seven sites studied.
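The synergistic effect Zhou et al. describe can be illustrated with a toy interaction-term regression in Python; the coefficients below are hypothetical stand-ins, not the fitted values from their model:

```python
# Toy regression with a rainfall-temperature interaction term (hypothetical
# coefficients): incidence responds mainly to the PRODUCT of the two
# anomalies, so neither variable alone tracks epidemic risk.

def incidence_anomaly(temp_anom, rain_anom, b_t=0.1, b_r=0.1, b_tr=1.0):
    """Main effects of temperature and rainfall plus their interaction."""
    return b_t * temp_anom + b_r * rain_anom + b_tr * temp_anom * rain_anom

# Warm AND wet: a large anomaly, driven by the interaction term.
both = incidence_anomaly(2.0, 2.0)       # 0.2 + 0.2 + 4.0 = 4.4
# Warm but dry, or wet but cool: small anomalies despite one variable spiking.
warm_only = incidence_anomaly(2.0, 0.0)  # 0.2
wet_only = incidence_anomaly(0.0, 2.0)   # 0.2

print(both, warm_only, wet_only)
```

This is why, as the authors note, monitoring either temperature or rainfall in isolation is too insensitive to flag the conditions associated with epidemics.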

In light of their findings, Zhou et al. concluded that "malaria dynamics are largely driven by autoregression and/or seasonality in these sites," and that "the observed large among-site variation in the sensitivity to climate fluctuations may be governed by complex interactions between climate and biological and social factors," including "land use, topography, P. falciparum genotypes, malaria vector species composition, availability of vector control and healthcare programs, drug resistance, and other socioeconomic factors," among which are "failure to seek treatment or delayed treatment of malaria patients, and HIV infections in the human population," which they say have "become increasingly prevalent." Hence, it would appear either that the so-called unprecedented global warming of the past century or so, which is claimed by climate alarmists to have significantly accelerated over the past couple of decades, should be the least of our worries with respect to this subject, or that the claimed acceleration of warming is more imagined than real.

Prefacing another revealing study, Kuhn et al. (2003) say "there has been much recent speculation that global warming may allow the reestablishment of malaria transmission in previously endemic areas such as Europe and the United States." In particular, they note that "the British Chief Medical Officer's recent report [Getting Ahead of the Curve: A Strategy for Combating Infectious Diseases (Including Other Aspects of Health Protection), Department of Health (2002), London] asserted that 'by 2050 the climate of the UK may be such that indigenous malaria could become re-established'," which is the same mantra that is incessantly chanted by the world's climate alarmists. Consequently, to investigate the robustness of this hypothesis, they analyzed the determinants of temporal trends in malaria deaths within England and Wales from 1840 to 1910.

With respect to temperature changes over the period of study, this analysis indicated that "a 1°C increase or decrease was responsible for an increase in malaria deaths of 8.3% or a decrease of 6.5%, respectively," which explains "the malaria epidemics in the 'unusually hot summers' of 1848 and 1859." Nevertheless, the long-term near-linear temporal decline in malaria deaths over the period of study, in the words of the researchers, "was probably driven by nonclimatic factors," among which they list increasing livestock populations (which tend to divert mosquito biting from humans), decreasing acreages of marsh wetlands (where mosquitoes breed), as well as "improved housing, better access to health care and medication, and improved nutrition, sanitation, and hygiene." They additionally note that the number of secondary cases arising from each primary imported case "is currently minuscule," as demonstrated by the absence of any secondary malaria cases in the UK since 1953.

Although simplistic model simulations may suggest that the increase in temperature predicted for Britain by 2050 is likely to cause an 8-14% increase in the potential for malaria transmission, Kuhn et al. say "the projected increase in proportional risk is clearly insufficient to lead to the reestablishment of endemicity." Expanding on this statement, they note that "the national health system ensures that imported malaria infections are detected and effectively treated and that gametocytes are cleared from the blood in less than a week." For Britain, therefore, they conclude that "a 15% rise in risk might have been important in the 19th century, but such a rise is now highly unlikely to lead to the reestablishment of indigenous malaria," since "socioeconomic and agricultural changes" have greatly altered the cause-and-effect relationships of the past.

In the introduction to his review about what was known to this point in time about the putative link between global warming and the spread of infectious diseases, Zell (2004) stated that many people "assume a correlation between increasing disease incidence and global warming." However, as he concluded after studying the issue in considerable depth, "the factors responsible for the emergence/reemergence of vector-borne diseases are complex and mutually influence each other," citing as an example of this complexity the fact that "the incidence and spread of parasites and arboviruses are affected by insecticide and drug resistance, deforestation, irrigation systems and dams, changes in public health policy (decreased resources of surveillance, prevention and vector control), demographic changes (population growth, migration, urbanization), and societal changes (inadequate housing conditions, water deterioration, sewage, waste management)." Therefore, as he continues, "it may be over-simplistic to attribute emergent/re-emergent diseases to climate change and sketch the menace of devastating epidemics in a warmer world," such as Al Gore does in An Inconvenient Truth. Indeed, Zell states that "variations in public health practices and lifestyle can easily [our italics] outweigh changes in disease biology," especially those that might be caused by global warming.

In a rather different type of study, but one that is extremely pertinent, Tuchman et al. (2003) (1) took leaf litter from Populus tremuloides (Michaux) trees that had been grown out-of-doors in open-bottom root boxes located within open-top aboveground chambers maintained at atmospheric CO2 concentrations of either 360 or 720 ppm for an entire growing season, (2) incubated the leaf litter for 14 days in a nearby stream, and (3) fed the incubated litter to four species of detritivorous mosquito larvae to assess its effect on their development rates and survivorship. This work revealed that larval mortality was 2.2 times higher for Aedes albopictus (Skuse) mosquitoes that were fed leaf litter that had been produced in the high-CO2 chambers than it was for those fed litter that had been produced in the ambient-air chambers. In addition, they found that larval development rates of Aedes triseriatus (Say), Aedes aegypti (L.) and Armigeres subalbatus (Coquillett) were slowed by 78%, 25% and 27%, respectively, when fed litter produced in the high-CO2 as opposed to the ambient-CO2 chambers, so that mosquitoes of these species spent 20, 11 and 9 days longer in their respective larval stages when feeding on litter produced in the CO2-enriched as compared to the ambient-CO2 chambers. As for the reason behind these observations, the researchers suggest that "increases in lignin coupled with decreases in leaf nitrogen induced by elevated CO2 and subsequent lower bacterial productivity [on the leaf litter in the water] were probably responsible for [the] decreases in survivorship and/or development rate of the four species of mosquitoes."

What is the significance of these findings?

In the words of Tuchman et al., "the indirect impacts of an elevated CO2 atmosphere on mosquito larval survivorship and development time could potentially be great," because longer larval development times could result in fewer cohorts of mosquitoes surviving to adulthood; and with fewer mosquitoes around, there should be lower levels of mosquito-borne diseases, for which blessing we would have the ongoing rise in the atmosphere's CO2 concentration to thank.

In another major review of the potential impacts of global warming on vector-borne diseases, Rogers and Randolph (2006) focus on recent upsurges of malaria in Africa, asking the question "Has climate change already had an impact?" They go on to demonstrate that "evidence for increasing malaria in many parts of Africa is overwhelming, but the more likely causes for most of these changes to date include land-cover and land-use changes and, most importantly, drug resistance rather than any effect of climate," noting that "the recrudescence of malaria in the tea estates near Kericho, Kenya, in East Africa, where temperature has not changed significantly [our italics], shows all the signs of a disease that has escaped drug control following the evolution of chloroquine resistance by the malarial parasite." They then go on to explain that "malaria waxes and wanes to the beat of two rhythms: an annual one dominated by local, seasonal weather conditions and a ca. 3-yearly one dominated by herd immunity," noting that "effective drugs suppress both cycles before they can be expressed," but that "this produces a population which is mainly or entirely dependent on drug effectiveness, and which suffers the consequence of eventual drug failure, during which the rhythms reestablish themselves, as they appear to have done in Kericho."

Last of all, Childs et al. (2006) present a detailed analysis of malaria incidence in northern Thailand based on a quarter-century monthly time series (January 1977 through January 2002) of total malaria cases in the country's 13 northern provinces. Over this time period, when climate alarmists claim the world warmed at a rate and to a level that were unprecedented over the prior two millennia, they report an approximately constant rate of decline in total malaria incidence (from a mean monthly incidence of 41.5 cases per 100,000 people in 1977 to 6.72 cases per 100,000 people in 2001), due primarily to a reduction in cases positive for Plasmodium falciparum (mean monthly incidence of 28.6 and 3.22 cases per 100,000 people in 1977 and 2001, respectively) and secondarily to a reduction in cases positive for P. vivax (mean monthly incidence of 12.8 and 3.5 cases per 100,000 people in 1977 and 2001, respectively). Consequently, noting that "there has been a steady reduction through time of total malaria incidence in northern Thailand, with an average decline of 6.45% per year," they say this result "reflects changing agronomic practices and patterns of immigration, as well as the success of interventions such as vector control programs, improved availability of treatment and changing drug policies."
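Taking the two endpoint incidences quoted above at face value, the implied average annual rate of decline can be sketched in Python. The result differs somewhat from the paper's 6.45% per year, presumably because Childs et al. fitted their figure to the full monthly series rather than to the endpoints alone:

```python
# Back-of-envelope check of the reported decline, assuming only the endpoint
# incidences quoted above (41.5 per 100,000 in 1977; 6.72 in 2001).
import math

start, end, years = 41.5, 6.72, 2001 - 1977   # 24 years

# Constant-rate (geometric) decline: end = start * (1 - r)**years
annual_decline = 1.0 - (end / start) ** (1.0 / years)
print(f"{annual_decline:.1%}")   # roughly 7.3% per year from the endpoints alone
```

The endpoint-only estimate is in the same neighborhood as the fitted 6.45%, consistent with the authors' description of an approximately constant rate of decline.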

In light of the many findings described above, the claim that malaria will expand across the globe and intensify as a result of CO2-induced warming is seen to be basically bogus. In the words of Dye and Reiter (2000), "given adequate funding, technology, and, above all, commitment, the campaign to 'Roll Back Malaria,' spearheaded by the World Health Organization, will have halved deaths related to [malaria] by 2010," independent of whatever tack earth's climate might take in the interim.

Shanks, G.D., Biomndo, K., Hay, S.I. and Snow, R.W. 2000. Changing patterns of clinical malaria since 1965 among a tea estate population located in the Kenyan highlands. Transactions of the Royal Society of Tropical Medicine and Hygiene 94: 253-255.

Zhou, G., Minakawa, N., Githeko, A.K. and Yan, G. 2004. Association between climate variability and malaria epidemics in the East African highlands. Proceedings of the National Academy of Sciences, USA 101: 2375-2380.