Studies show that the decline in death rates in some countries, which accompanied a decline in infectious diseases, had little to do with medical technology and much to do with increasing resistance to disease because of improved nutrition. A study of malaria deaths in India in the early 20th century similarly links starvation and immunity. Yet despite the prevalence of acute and chronic hunger in India, hunger has fallen off the public health map.

A medical doctor, Thomas McKeown, less well known than he ought to be among public health workers and historians, offered startling new insights into the advances in human longevity and health.1 Surveying England and Wales, he observed that significant and long-term declines in the death rate, starting in the 18th century, had occurred due to the decline in infectious and communicable diseases, which disproportionately affect the poor. More remarkably, by plotting the point on the declining curve at which effective medical technology became available, he concluded that medical technology had little to do with this decline in mortality, with the possible exception of smallpox.

Tuberculosis is a striking example. By the time the tubercle bacillus was identified in 1882 by Robert Koch, bacteriologist and one of the architects of the germ theory, the death toll due to tuberculosis had already shown a long-term secular decline. By the time effective chemotherapy was discovered in the 1940s, tuberculosis had ceased to be a major public health problem in the West. The important role of chemotherapy in the control of tuberculosis cannot be denied, but this highlights the importance of other factors all too frequently forgotten in the hubris of medical technology.

The decline in death rates extended beyond tuberculosis to almost all the major infectious diseases, including bronchitis, pneumonia, whooping cough and measles. The decline in infectious diseases was unlikely to be related to changes in the virulence of the infectious agents over so short a period of time. Nor could it be attributed to salubrious changes in the environment, which had in fact deteriorated due to industrialisation and urbanisation. Excluding these possible causes, McKeown concluded that this dramatic decline could only have been a consequence of increased general resistance to infectious diseases, owing to improvements in the nutritional status of the population brought about by wide-ranging changes in the agrarian economy.

McKeown acknowledged that the public health revolution of the late-19th century played an important role in reducing exposure to water-borne diseases such as diarrhoea, dysentery and cholera. At most, however, these could account for a quarter to a third of the decline in mortality. Even in the case of this group of diseases, the underlying cause of the decline in lethality may well have been the same: increasing human resistance due to improvements in nutrition. Related to this, in the latter part of the 19th century, was an increase in real wages of the order of 66%.2

This thesis has been a matter of some controversy. But it has been strengthened by a number of other studies, which note that other countries in the West had a similar health trajectory as living standards improved. The McKinlays reveal that modern medicine, both preventive and curative, accounted for a minor proportion of the mortality decline from infectious diseases in the United States.3 Preventive health measures were largely undertaken in urban areas, but the decline in death rates extended to rural areas as well. Data from a number of European countries over this period indicate that there were increases in mean heights, along with a decrease in class differentials in heights, both attesting to improvements in the nutritional status of the population.4 Fogel concluded that improvements in nutritional status, as indicated by stature and body mass index, accounted for a substantial proportion of the decline in mortality in England, France and Sweden between 1775 and 1875.5

When we turn to India or other colonised countries, there is an almost complete Malthusian consensus that over-population was the cause of both poverty and diseases. The 19th century experience in England of rising incomes associated with an increasing population is forgotten in this explanation, as is the colonial drain of resources from these countries. The Western experience is, it is maintained, not applicable to India, rendering the McKeownite model irrelevant. Solutions to the problems of ill health and disease are then sought only in the domain of medical technology.

Kingsley Davis, a guru of modern demographers, in his classic The Population of India and Pakistan, set the trend. He argued that the “gift of death control technologies” from the West was responsible for the decline of the death rate commencing in the 1920s.6 He was referring, of course, to the role of DDT in the control of malaria. His primary argument, fuelled by Cold War concerns, was the urgent need for birth control technologies to control population growth. Perhaps picking up from Davis, the Cambridge Economic History of India, in a chapter on population, assumes that the post-1921 decline in the death rate was due to public health measures. While plague somewhat mysteriously subsided, cholera and smallpox were vanquished by public health intervention.7

One significant problem with these assertions is that there is very little empirical data to substantiate them. The decline in the death rate commencing in the 1920s, a major proportion of which was due to a decline in deaths from malaria, preceded by at least three decades the launch of the malaria eradication programme in the 1950s. Further, over the same period, mortality due to a range of diseases for which there were no preventive measures or specific therapies also declined. These included diseases such as cholera and smallpox.8

Sumit Guha dismisses Davis’s explanation as “certainly not applicable to India between the Wars”.9 Sheila Zurbrigg’s work on hunger and epidemic mortality not only brings fresh insights challenging received wisdom, but also strengthens a McKeownite understanding of health history in India.10 Studying malaria mortality in Punjab between 1868 and 1940, Zurbrigg found a most extraordinary decline starting around 1908.

In the 41-year period between 1868 and 1908, malaria deaths tracked not just rainfall, essential for malaria transmission, but soaring foodgrain prices. Between 1909 and 1941, malaria death rates dropped to less than one-third of the earlier level. This drop was accompanied by no decline in epidemiological indices of malaria transmission, in rainfall and flooding, or in entomological indices. More significantly, no effective preventive or therapeutic measures were widely applied. Indeed, per capita availability of quinine was so low as to make this explanation extremely implausible.

What did change after 1908 was the incidence and severity of famine or epidemic hunger. While it is undoubtedly true that under the colonial regime per capita food availability declined, what Zurbrigg’s work reveals is the critical importance of state intervention: the political exigencies which compelled the British government to haltingly, hesitantly, initiate steps to control famine. These did not reduce the prevalence of diseases or indeed even their incidence; what they did do was reduce excess deaths due to diseases induced by starvation, by lowering the lethality of diseases.

The specific measure was the abandonment of the Malthusian policy of laissez faire in favour of purposive intervention through a changed famine code. This mandated public intervention through income support by employment generation in times of dearth and rising prices. These steps did little to combat chronic or endemic hunger. They did, nevertheless, lessen the excess deaths due to acute epidemic hunger and the diseases that underlay the periodic subsistence crises of the period. In epidemiological terms, what changed was the lethality of diseases in response to an altered epidemiological triad.11

It is this factor – organised public action – which lies at the heart of public health, altering the interplay and impact of the web of factors that determine health: access to resources, employment, incomes, and thus food. Data on improvements in health in the 20th century in England and Wales also strengthen the McKeownite understanding of health improvement. The most marked improvement in life expectancy was in the decades of the two World Wars, despite the substantial loss of young lives (as revealed in Table I). This again was due not to advances in healthcare but to increases in employment and, above all, food rationing.

Table I: Longevity expansion in England and Wales in the 20th century: Increase in life expectancy per decade

It is important to recollect this history because hunger has fallen off the public health landscape – and indeed mindscape – in India today. The Approach Paper to the Eleventh Plan,12 for example, in its section on health, mentions not a word about the huge prevalence of hunger in the country. Not only is hunger a major public health problem in itself, it also undergirds the massive morbidity and mortality load in the country.

A fierce debate rages in the country on the levels of poverty. Without entering into that debate, it remains the case that even those arguing that there has been substantial improvement in poverty levels over the last decade concede that close to a third of the population still lives under the poverty line and is thus unable to meet its calorie requirements.

Data from the National Nutrition Monitoring Bureau (NNMB) indicate that there has been a decline in the prevalence of severe under-nutrition in children between 1-5 years of age, from 11.1% in 1992 to 6.9% in 1995. However, this compares to a prevalence rate of 6.2% in 1982. While this relatively modest improvement is heartening, the level of moderate under-nutrition remains substantially unchanged at 43.5%, while mild under-nutrition has increased from 36.6% in 1992 to 40.6% in 1995. Overall, the proportion of children who are nutritionally normal has increased from 7.2% in 1992 to 8.5% in 1995. Again, this should be tempered with caution since the figure stood at 15.6% in 1982.13 These data need to be placed in the context of a dramatic decline in per capita availability of cereals commencing in 1991: per capita daily availability of cereals declined from 468.5 grams in 1991 to 428.8 grams in 1999, and that of pulses from 41.1 grams to 38.6 grams.14 Indeed, the NNMB notes that the average calorie consumption in the population in 1995 was below the RDA.15

Data from NFHS II, however, indicate higher levels of hunger than the NNMB data; they also pertain to the whole of India rather than just the seven states covered by the NNMB. NFHS II reveals that almost half the children under three years of age (47%) are underweight, and a similar proportion (46%) are stunted; 18% of children below three years of age are severely undernourished, down from 20% in NFHS I. The proportion of children severely stunted stood at 23%.16 Wasting, or acute under-nutrition, affects 16% of children under three years of age. Under-nutrition is substantially higher in rural areas than in urban areas, but even in urban areas more than a third of children are either underweight or stunted. Levels of under-nutrition are also substantially higher among dalits (underweight 53.5%, severely underweight 21.2%; stunting 51.7%; wasting 16%) and adivasis (underweight 55.9%, severely underweight 26%; stunting 52.8%; wasting 21.8%). Anaemia affects nearly three-quarters of children (74%); 46% have moderate and 5% severe anaemia. Among dalits, anaemia affects 78.3% of children, 6.6% severely; among adivasis, 79.8%, 6.9% severely. The proportion of children weighing less than 2.5 kg at birth stood at 24% in rural areas and 21% in urban areas.

The NFHS II data also reveal a far from satisfactory nutritional status of women in the country. More than a third (36%) of women had a BMI (body mass index) of less than 18.5, indicative of chronic hunger or chronic energy deficiency.17 Among poor women, who are thus more likely to be illiterate, the proportion with a BMI below 18.5 is 42.6%; among dalits it is 42.1%. Women in households with a low standard of living index have chronic hunger levels of 48.1%. Prevalence rates of chronic hunger in rural areas (40.6%) are almost double those in urban areas (22.6%).

The prevalence of anaemia is, not surprisingly, equally widespread. The overall prevalence rate is 52%, with 35% of women mildly anaemic, 15% moderately anaemic and 2% severely anaemic.18 Prevalence rates are considerably higher for rural women (54%) than urban women (46%). The rate is 60.2% among women in households with a low standard of living, and even among those with a high standard of living it is as high as 41.9%. Among dalits the prevalence rate is 56% and among adivasis, 64.9%. Among non-pregnant, non-breast-feeding women it is 50.4%.

While attention has been drawn to the poor health and nutritional status of women, not enough attention has been paid to that of men. Indeed, the NFHS has no data on the prevalence of hunger among men. The NNMB however notes that 49% of adult males also suffered from chronic energy deficiency in 1990.19

In short, the nutritional data unambiguously reveal the continuing high prevalence of hunger in the population. The high prevalence of chronic hunger, in addition to acute hunger, is undoubtedly the cause of the continuing high mortality and morbidity load in the country. This is doubly tragic since it not only imposes suffering and disease, it also represents a waste of unrealised human potential in these populations of the country.

Although food is so central to the health of the population, levels of hunger find no mention in the National Health Policy of the country. When hunger does find mention in health discourses, it is frequently only as anaemia, a cause of high maternal morbidity and mortality. While maternal mortality is indeed high and unconscionable, the neglect of hunger in public health causes far more damage than is realised.

(Mohan Rao, a medical doctor specialised in public health, teaches at the Centre of Social Medicine and Community Health, Jawaharlal Nehru University, New Delhi, where he is a Professor)

David Blane (1990), “Real Wages, The Economic Cycle, and Mortality in England and Wales, 1870-1914”, International Journal of Health Services, Vol. 20, No. 1, pp. 43-52.

J. B. McKinlay and S. M. McKinlay (1977), “The Questionable Contribution of Medical Measures to the Mortality Decline in the United States in the Twentieth Century”, Milbank Memorial Fund Quarterly, Vol. 55, Summer, pp. 405-428.

Government of India, Planning Commission (2006), Towards Faster and More Inclusive Growth: An Approach to the 11th Five-Year Plan, New Delhi.

National Nutrition Monitoring Bureau (1997), Twenty Five Years of NNMB 1972-1995, Hyderabad.

Government of India, Ministry of Health and Family Welfare (1999), Health Information of India 1999, New Delhi.

The average calorie consumption was 2172 kcal as against the RDA of 2425 kcal (NNMB, op. cit.).

International Institute of Population Sciences (2002), National Family Health Survey (NFHS 2) 1998-99, Mumbai. A comparison cannot be made with NFHS I on stunting since height was not measured in five states during NFHS I.

The Body Mass Index (BMI) is defined as the weight in kilograms divided by the square of the height in metres.
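As a worked illustration with hypothetical figures (not drawn from the survey data), a woman weighing 50 kg with a height of 1.60 m would have:

```latex
\[
\mathrm{BMI} \;=\; \frac{\text{weight (kg)}}{\text{height (m)}^{2}}
\;=\; \frac{50}{(1.60)^{2}}
\;=\; \frac{50}{2.56}
\;\approx\; 19.5
\]
```

This falls just above the cut-off of 18.5 used in the text to indicate chronic energy deficiency.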

Mild anaemia is defined as haemoglobin levels of 10.0-10.9 g/dl for pregnant women and 10.0-11.9 g/dl for non-pregnant women; moderate anaemia as 7.0-9.9 g/dl; and severe anaemia as less than 7.0 g/dl. Anaemia is one of the leading underlying causes of death among women in the country, not just among the pregnant.