Current practice

Public health programs

Most governments recognize the importance of public health programs in reducing the incidence of disease, disability, and the effects of aging and other physical and mental health conditions, although public health generally receives significantly less government funding compared with medicine.[5] Public health programs providing vaccinations have made strides in promoting health, including the eradication of smallpox, a disease that plagued humanity for thousands of years.

Public health surveillance systems can:

document the impact of an intervention, or track progress towards specified goals;

monitor and clarify the epidemiology of health problems, allow priorities to be set, and inform health policy and strategies; and

diagnose, investigate, and monitor health problems and health hazards of the community.

Public health surveillance has led to the identification and prioritization of many public health issues facing the world today, including HIV/AIDS, diabetes, waterborne diseases, zoonotic diseases, and antibiotic resistance leading to the reemergence of infectious diseases such as tuberculosis. Antibiotic resistance, also known as drug resistance, was the theme of World Health Day 2011. Although the prioritization of pressing public health issues is important, Laurie Garrett argues that it has unintended consequences.[8] When foreign aid is funneled into disease-specific programs, the importance of public health in general is disregarded. This public health problem of stovepiping is thought to create a lack of funds to combat other existing diseases in a given country.

For example, the WHO reports that at least 220 million people worldwide suffer from diabetes. Its incidence is increasing rapidly, and the number of diabetes deaths is projected to double by the year 2030.[9] In a June 2010 editorial in the medical journal The Lancet, the authors opined that "The fact that type 2 diabetes, a largely preventable disorder, has reached epidemic proportion is a public health humiliation."[10] The risk of type 2 diabetes is closely linked with the growing problem of obesity. The WHO's latest estimates indicate that approximately 1.5 billion adults worldwide were overweight in 2008, and nearly 43 million children under the age of five were overweight in 2010.[11] The United States leads with 30.6% of its population obese, followed by Mexico with 24.2% and the United Kingdom with 23%. Once considered a problem only in high-income countries, obesity is now on the rise in low-income countries, especially in urban settings. Many public health programs are increasingly dedicating attention and resources to obesity, with objectives to address its underlying causes by promoting healthy diet and physical exercise.

Some programs and policies associated with public health promotion and prevention can be controversial. One such example is programs focusing on the prevention of HIV transmission through safe sex campaigns and needle-exchange programmes. Another is the control of tobacco smoking. Changing smoking behavior requires long-term strategies, unlike the fight against communicable diseases, which usually takes a shorter period for effects to be observed. Many nations have implemented major initiatives to cut smoking, such as increased taxation and bans on smoking in some or all public places. Proponents present evidence that smoking is one of the major killers and argue that governments therefore have a duty to reduce the death rate, both by limiting passive (second-hand) smoking and by providing fewer opportunities for people to smoke. Opponents say that this undermines individual freedom and personal responsibility, and worry that the state may be emboldened to remove more and more choice in the name of better population health overall.

Simultaneously, while communicable diseases have historically ranked uppermost as a global health priority, non-communicable diseases and the underlying behavior-related risk factors have been at the bottom. This is changing, however, as illustrated by the United Nations hosting its first General Assembly Special Summit on the issue of non-communicable diseases in September 2011.[12]

Many health problems are due to maladaptive personal behaviors. From an evolutionary psychology perspective, overconsumption of novel substances that are harmful is due to the activation of an evolved reward system for substances such as drugs, tobacco, alcohol, refined salt, fat, and carbohydrates. New technologies such as modern transportation also cause reduced physical activity. Research has found that behavior is more effectively changed by taking evolutionary motivations into consideration rather than only presenting information about health effects. Thus, the increased use of soap and hand-washing to prevent diarrhea is much more effectively promoted if their lack of use is associated with the emotion of disgust. Disgust is an evolved system for avoiding contact with substances that spread infectious diseases. Examples might include films that show how fecal matter contaminates food. The marketing industry has long known the importance of associating products with high status and attractiveness to others. Conversely, it has been argued that emphasizing the harmful and undesirable effects of tobacco smoking on other persons and imposing smoking bans in public places have been particularly effective in reducing tobacco smoking.[13]

Applications in health care

As well as seeking to improve population health through the implementation of specific population-level interventions, public health contributes to medical care by identifying and assessing population needs for health care services, including:[14][15][16][17]

Assessing current services and evaluating whether they are meeting the objectives of the health care system

Considering the effect on resources for proposed interventions and assessing their cost-effectiveness

Supporting decision making in health care and planning health services including any necessary changes.

Informing, educating, and empowering people about health issues

Implementing effective improvement strategies

To improve public health, one important strategy is to promote modern medicine and scientific neutrality in driving public health policy and campaigns, an approach recommended by Armando Solorzano through a case study of the Rockefeller Foundation's hookworm campaign in Mexico in the 1920s. Solorzano argues that public health policy cannot concern only politics or economics. Political concerns, such as upcoming elections, can lead government officials to hide the real numbers of people affected by disease in their regions. Scientific neutrality in making public health policy is therefore critical; it can ensure that treatment needs are met regardless of political and economic conditions.[18]

The history of public health care clearly shows the global effort to improve health care for all. However, in modern-day medicine, real, measurable change has not been clearly seen, and critics argue that this lack of improvement is due to the ineffective methods being implemented. As Paul E. Farmer argues, structural interventions could have a large impact, yet there are numerous reasons why this strategy has not been incorporated into the health system. One of the main reasons, he suggests, may be that physicians are not properly trained to carry out structural interventions, meaning that ground-level health care professionals cannot implement these improvements. While structural interventions cannot be the only area for improvement, the lack of coordination between socioeconomic factors and health care for the poor could be counterproductive, ultimately causing greater inequity between the health care services received by the rich and the poor. Unless health care ceases to be treated as a commodity, and the way it is delivered to those with less access to it changes, the universal goal of public health care cannot be achieved.[19]

Public Health 2.0

Public Health 2.0 is a movement within public health that aims to make the field more accessible to the general public and more user-driven. The term is used in three senses. In the first sense, "Public Health 2.0" is similar to "Health 2.0" and describes the ways in which traditional public health practitioners and institutions are reaching out (or could reach out) to the public through social media and health blogs.[20][21]

In the second sense, "Public Health 2.0" describes public health research that uses data gathered from social networking sites, search engine queries, cell phones, or other technologies.[22] A recent example is the proposal of a statistical framework that uses online user-generated content (from social media or search engine queries) to estimate the impact of an influenza vaccination campaign in the UK.[23]
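The general idea behind such frameworks can be illustrated with a minimal sketch: fit a trend to a proxy illness signal (e.g. weekly influenza-like-illness rates inferred from search queries) over the pre-campaign weeks, project that trend forward as a counterfactual, and compare it with the observed post-campaign signal. This is not the cited framework itself, and all figures below are made up for illustration.

```python
# Illustrative sketch only: estimating a vaccination campaign's impact from a
# proxy illness signal. The numbers and the simple linear counterfactual are
# hypothetical, not the method or data of the cited UK study.

def fit_trend(xs, ys):
    """Ordinary least-squares fit y = a + b*x over the pre-campaign weeks."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept a, slope b

def campaign_impact(pre, post):
    """Compare observed post-campaign rates with the counterfactual that the
    pre-campaign trend predicts; return the estimated relative reduction."""
    a, b = fit_trend(range(len(pre)), pre)
    expected = [a + b * (len(pre) + i) for i in range(len(post))]
    return 1 - sum(post) / sum(expected)

# Hypothetical weekly illness rates per 100,000, inferred from query volumes.
pre_campaign = [52, 55, 58, 61, 63, 66]   # rising trend before the campaign
post_campaign = [60, 57, 53, 50]          # flattening after vaccination begins

print(f"estimated relative reduction: {campaign_impact(pre_campaign, post_campaign):.0%}")
```

Real frameworks of this kind must additionally correct for the biases of user-generated data (who searches, who posts) and for seasonal effects, which a straight-line counterfactual ignores.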

In the third sense, "Public Health 2.0" is used to describe public health activities that are completely user-driven.[24] An example is the collection and sharing of information about environmental radiation levels after the March 2011 tsunami in Japan.[25] In all cases, Public Health 2.0 draws on ideas from Web 2.0, such as crowdsourcing, information sharing, and user-centred design.[26] While many individual healthcare providers have started making their own personal contributions to "Public Health 2.0" through personal blogs, social profiles, and websites, other larger organizations, such as the American Heart Association (AHA) and United Medical Education (UME), have larger teams of employees centered around online health education, research, and training. These private organizations recognize the need for free and easy-to-access health materials, often building libraries of educational articles.

Developing countries

There is a great disparity in access to health care and public health initiatives between developed nations and developing nations. In the developing world, public health infrastructures are still forming. There may not be enough trained health workers or monetary resources to provide even a basic level of medical care and disease prevention.[27] As a result, a large majority of disease and mortality in the developing world results from and contributes to extreme poverty. For example, many African governments spend less than US$10 per person per year on health care, while, in the United States, the federal government spent approximately US$4,500 per capita in 2000. However, expenditures on health care should not be confused with spending on public health. Public health measures may not generally be considered "health care" in the strictest sense. For example, mandating the use of seat belts in cars can save countless lives and contribute to the health of a population, but typically money spent enforcing this rule would not count as money spent on health care.

Large parts of the developing world remain plagued by largely preventable or treatable infectious diseases and poor maternal and child health, exacerbated by malnutrition and poverty. The WHO reports that a lack of exclusive breastfeeding during the first six months of life contributes to over a million avoidable child deaths each year.[28] Intermittent preventive therapy, aimed at treating and preventing malaria episodes among pregnant women and young children, is one public health measure used in endemic countries.

Since the 1980s, the growing field of population health has broadened the focus of public health from individual behaviors and risk factors to population-level issues such as inequality, poverty, and education. Modern public health is often concerned with addressing determinants of health across a population. There is a recognition that our health is affected by many factors including where we live, genetics, our income, our educational status and our social relationships; these are known as "social determinants of health." A social gradient in health runs through society. The poorest generally suffer the worst health, but even the middle classes will generally have worse health outcomes than those of a higher social stratum.[29] The new public health advocates for population-based policies that improve health in an equitable manner.

Sustainable Development Goals

To address current and future challenges in world health, the United Nations developed the Sustainable Development Goals in 2015, building on the Millennium Development Goals of 2000, to be completed by 2030.[30] Taken together, the goals encompass the entire spectrum of development across nations; however, Goals 1-6 directly address health disparities, primarily in developing countries.[31] These six goals address key issues in global public health: poverty; hunger and food security; health; education; gender equality and women's empowerment; and water and sanitation.[31] Public health officials can use these goals to set their own agenda and plan smaller-scale initiatives for their organizations. The goals aim to lessen the burden of disease and inequality faced by developing countries and lead to a healthier future.

The links between the various sustainable development goals and public health are numerous and well established:

Living below the poverty line is associated with poorer health outcomes, and this can be even worse for persons living in developing countries, where extreme poverty is more common.[32] A child born into poverty is twice as likely to die before the age of five as a child from a wealthier family.[33]

The detrimental effects of hunger and malnutrition that can arise from systemic challenges with food security are enormous. The World Health Organization estimates that 12.9 percent of the population in developing countries is undernourished.[34]

Health challenges in the developing world are enormous, with only half of the women in developing nations receiving the recommended amount of healthcare they need.[33]

Educational equity has yet to be reached in the world. Public health efforts are impeded by this, as a lack of education can lead to poorer health outcomes. This is shown by children of mothers with no education having a lower survival rate than children born to mothers with primary or greater levels of education.[33] Although cultural differences in the role of women vary by country, many gender inequalities are found in developing nations. Combating these inequalities has been shown to lead to better public health outcomes.

In studies done by the World Bank on populations in developing countries, it was found that when women had more control over household resources, the children benefit through better access to food, healthcare, and education.[35]

Basic sanitation resources and access to clean sources of water are a basic human right. However, 1.8 billion people globally use a source of drinking water that is fecally contaminated, and 2.4 billion people lack access to basic sanitation facilities such as toilets or pit latrines.[36] The lack of these resources causes approximately 1,000 children a day to die from diarrhoeal diseases that could have been prevented by better water and sanitation infrastructure.[36]

Education and training

Education and training of public health professionals is available throughout the world in Schools of Public Health, Medical Schools, Veterinary Schools, Schools of Nursing, and Schools of Public Affairs. The training typically requires a university degree with a focus on core disciplines of biostatistics, epidemiology, health services administration, health policy, health education, behavioral science and environmental health.[37][38] In the global context, the field of public health education has evolved enormously in recent decades, supported by institutions such as the World Health Organization and the World Bank, among others. Operational structures are formulated by strategic principles, with educational and career pathways guided by competency frameworks, all requiring modulation according to local, national and global realities. It is critically important for the health of populations that nations assess their public health human resource needs and develop their ability to deliver this capacity, and not depend on other countries to supply it.[39]

Schools of public health: a U.S. perspective

In the United States, the Welch-Rose Report of 1915[40] has been viewed as the basis for the critical movement in the history of the institutional schism between public health and medicine because it led to the establishment of schools of public health supported by the Rockefeller Foundation.[41] The report was authored by William Welch, founding dean of the Johns Hopkins Bloomberg School of Public Health, and Wickliffe Rose of the Rockefeller Foundation. The report focused more on research than practical education.[41][42] Some have blamed the Rockefeller Foundation's 1916 decision to support the establishment of schools of public health for creating the schism between public health and medicine and legitimizing the rift between medicine's laboratory investigation of the mechanisms of disease and public health's nonclinical concern with environmental and social influences on health and wellness.[41][43]

Over the years, the types of students and training provided have also changed. In the beginning, students who enrolled in public health schools typically had already obtained a medical degree; public health school training was largely a second degree for medical professionals. However, in 1978, 69% of American students enrolled in public health schools had only a bachelor's degree.[37]

Degrees in public health

Schools of public health offer a variety of degrees which generally fall into two categories: professional or academic.[50] The two major postgraduate degrees are the Master of Public Health (MPH) and the Master of Science in Public Health (MSPH). Doctoral studies in this field include the Doctor of Public Health (DrPH) and the Doctor of Philosophy (PhD) in a public health subspecialty. The DrPH is regarded as a professional leadership degree and the PhD as more of an academic degree.

Professional degrees are oriented towards practice in public health settings. The Master of Public Health, Doctor of Public Health, Doctor of Health Science (DHSc) and the Master of Health Care Administration are examples of degrees which are geared towards people who want careers as practitioners of public health in health departments, managed care and community-based organizations, hospitals and consulting firms, among others. Master of Public Health degrees broadly fall into two categories: those that put more emphasis on an understanding of epidemiology and statistics as the scientific basis of public health practice, and those that include a more eclectic range of methodologies. A Master of Science in Public Health is similar to an MPH but is considered an academic degree (as opposed to a professional degree) and places more emphasis on scientific methods and research. The same distinction can be made between the DrPH and the DHSc: the DrPH is considered a professional degree and the DHSc an academic degree.

Academic degrees are more oriented towards those with interests in the scientific basis of public health and preventive medicine who wish to pursue careers in research, university teaching in graduate programs, policy analysis and development, and other high-level public health positions. Examples of academic degrees are the Master of Science, Doctor of Philosophy, Doctor of Science (ScD), and Doctor of Health Science (DHSc). The doctoral programs are distinct from the MPH and other professional programs by the addition of advanced coursework and the nature and scope of a dissertation research project.

History

Early history

The primitive nature of medieval medicine rendered Europe helpless to the onslaught of the Black Death in the 14th century. Fragment of a miniature from "The Chronicles of Gilles Li Muisis" (1272-1352). Bibliothèque royale de Belgique, MS 13076-77, f. 24v.

Public health has roots in antiquity. From the beginnings of human civilization, it was recognized that polluted water and lack of proper waste disposal spread communicable diseases (theory of miasma). Early religions attempted to regulate behavior that specifically related to health, from the types of food eaten to the regulation of certain indulgent behaviors, such as drinking alcohol or sexual relations. Leaders were responsible for the health of their subjects to ensure social stability, prosperity, and the maintenance of order.

By Roman times, it was well understood that proper diversion of human waste was a necessary tenet of public health in urban areas. Ancient Chinese medical doctors developed the practice of variolation following a smallpox epidemic around 1000 BC. An individual without the disease could gain some measure of immunity against it by inhaling the dried crusts that formed around lesions of infected individuals. Children were also protected by inoculating a scratch on their forearms with the pus from a lesion.

In 1485 the Republic of Venice established a Permanent Court of supervisors of health, with special attention to preventing the spread of epidemics into the territory from abroad. The three supervisors were initially appointed by the Venetian Senate. In 1537 their appointment was assumed by the Grand Council, and in 1556 two judges were added, tasked with monitoring, on behalf of the Republic, the efforts of the supervisors.[54]

Modern public health

The 18th century saw rapid growth in voluntary hospitals in England.[55] The latter part of the century brought the establishment of the basic pattern of improvements in public health over the next two centuries: a social evil was identified, private philanthropists brought attention to it, and changing public opinion led to government action.[56]

1802 caricature of Edward Jenner vaccinating patients who feared it would make them sprout cowlike appendages.

The practice of vaccination became prevalent in the 1800s, following the pioneering work of Edward Jenner on smallpox. James Lind's discovery of the causes of scurvy amongst sailors and its mitigation via the introduction of fruit on lengthy voyages was published in 1754 and led to the adoption of this idea by the Royal Navy.[57] Efforts were also made to promulgate health matters to the broader public; in 1752 the British physician Sir John Pringle published Observations on the Diseases of the Army in Camp and Garrison, in which he advocated for the importance of adequate ventilation in military barracks and the provision of latrines for the soldiers.[58]

With the onset of the Industrial Revolution, living standards amongst the working population began to worsen, with cramped and unsanitary urban conditions. In the first four decades of the 19th century alone, London's population doubled, and even greater growth rates were recorded in the new industrial towns, such as Leeds and Manchester. This rapid urbanisation exacerbated the spread of disease in the large conurbations that built up around the workhouses and factories. These settlements were cramped and primitive, with no organized sanitation. Disease was inevitable, and its incubation in these areas was encouraged by the poor lifestyle of the inhabitants. The shortage of housing led to the rapid growth of slums, and the per capita death rate began to rise alarmingly, almost doubling in Birmingham and Liverpool. Thomas Malthus warned of the dangers of overpopulation in 1798. His ideas, as well as those of Jeremy Bentham, became very influential in government circles in the early years of the 19th century.[56]

Public health legislation

Sir Edwin Chadwick was a pivotal influence on the early public health campaign.

The Poor Law Commission reported in 1838 that "the expenditures necessary to the adoption and maintenance of measures of prevention would ultimately amount to less than the cost of the disease now constantly engendered". It recommended the implementation of large scale government engineering projects to alleviate the conditions that allowed for the propagation of disease.[56] The Health of Towns Association was formed in Exeter on 11 December 1844, and vigorously campaigned for the development of public health in the United Kingdom.[62] Its formation followed the 1843 establishment of the Health of Towns Commission, chaired by Sir Edwin Chadwick, which produced a series of reports on poor and insanitary conditions in British cities.[62]

These national and local movements led to the Public Health Act, finally passed in 1848. It aimed to improve the sanitary condition of towns and populous places in England and Wales by placing the supply of water, sewerage, drainage, cleansing and paving under a single local body with the General Board of Health as a central authority. The Act was passed by the Liberal government of Lord John Russell, in response to the urging of Edwin Chadwick. Chadwick's seminal report on The Sanitary Condition of the Labouring Population was published in 1842[63] and was followed up with a supplementary report a year later.[64]

Vaccination for various diseases was made compulsory in the United Kingdom in 1851, and by 1871 legislation required a comprehensive system of registration run by appointed vaccination officers.[65]

The Infectious Disease (Notification) Act 1889 mandated the reporting of infectious diseases to the local sanitary authority, which could then pursue measures such as the removal of the patient to hospital and the disinfection of homes and properties.[66]

In the U.S., the first public health organization based on a state health department and local boards of health was founded in New York City in 1866.[67]

Epidemiology

John Snow's dot map, showing the clusters of cholera cases in the London epidemic of 1854.

The science of epidemiology was founded by John Snow's identification of a polluted public water well as the source of an 1854 cholera outbreak in London. Dr. Snow believed in the germ theory of disease as opposed to the prevailing miasma theory. He first publicized his theory in an essay, On the Mode of Communication of Cholera, in 1849, followed by a more detailed treatise in 1855 incorporating the results of his investigation of the role of the water supply in the Soho epidemic of 1854.[68]

By talking to local residents (with the help of Reverend Henry Whitehead), he identified the source of the outbreak as the public water pump on Broad Street (now Broadwick Street). Although Snow's chemical and microscope examination of a water sample from the Broad Street pump did not conclusively prove its danger, his studies of the pattern of the disease were convincing enough to persuade the local council to disable the well pump by removing its handle.[69]

Snow later used a dot map to illustrate the cluster of cholera cases around the pump. He also used statistics to illustrate the connection between the quality of the water source and cholera cases. He showed that the Southwark and Vauxhall Waterworks Company was taking water from sewage-polluted sections of the Thames and delivering the water to homes, leading to an increased incidence of cholera. Snow's study was a major event in the history of public health and geography. It is regarded as the founding event of the science of epidemiology.[70]

Country examples

France

France from 1871 to 1914 followed well behind Bismarckian Germany, as well as Great Britain, in developing the welfare state, including public health. Tuberculosis was the most dreaded disease of the day, especially striking young people in their 20s. Germany set up vigorous measures of public hygiene and public sanatoria, but France let private physicians handle the problem, which left it with a much higher death rate.[75] The French medical profession jealously guarded its prerogatives, and public health activists were not as well organized or as influential as in Germany, Britain or the United States.[76][77] For example, there was a long battle over a public health law which began in the 1880s as a campaign to reorganize the nation's health services, to require the registration of infectious diseases, to mandate quarantines, and to improve the deficient health and housing legislation of 1850. However, the reformers met opposition from bureaucrats, politicians, and physicians. Because it was so threatening to so many interests, the proposal was debated and postponed for 20 years before becoming law in 1902. Success finally came when the government realized that contagious diseases had a national security impact, weakening military recruits and keeping the population growth rate well below Germany's.[78]

United States

Most public health activity in the United States took place at the municipal level before the mid-20th century, though there was some activity at the national and state levels as well.[79]

During the administration of John Adams, the second president of the United States, Congress authorized the creation of hospitals for mariners. As the U.S. expanded, the scope of the governmental health agency expanded as well.

Public health nursing made available through child welfare services in U.S. (c. 1930s)

In the United States, public health worker Sara Josephine Baker, M.D. established many programs to help the poor in New York City keep their infants healthy, leading teams of nurses into the crowded neighborhoods of Hell's Kitchen and teaching mothers how to dress, feed, and bathe their babies.

Another major public health improvement was the decline in the "urban penalty" brought about by improvements in sanitation. These improvements included chlorination of drinking water, filtration, and sewage treatment, which led to the decline in deaths caused by infectious waterborne diseases such as cholera and intestinal diseases.[80]
The federal Office of Indian Affairs (OIA) operated a large-scale field nursing program. Field nurses targeted native women for health education, emphasizing personal hygiene and infant care and nutrition.[81]

Latin America

Public health issues were important for the Spanish empire during the colonial era. Epidemic disease was the main factor in the decline of indigenous populations in the period immediately following the sixteenth-century conquest and remained a problem throughout the colonial era. The Spanish crown took steps in eighteenth-century Mexico to bring in regulations to make populations healthier.[82] In the late nineteenth century, Mexico was in the process of modernization, and public health issues were again tackled from a scientific point of view.[83][84][85][86] Even during the Mexican Revolution (1910-20), public health was an important concern, with a text on hygiene published in 1916.[87] During the Revolution, the feminist and trained nurse Elena Arizmendi Mejia founded the Neutral White Cross, treating wounded soldiers regardless of the faction for which they fought. In the post-revolutionary period after 1920, improved public health was a revolutionary goal of the Mexican government.[88][89]

Public health was important elsewhere in Latin America in consolidating state power and integrating marginalized populations into the nation-state. In Colombia, public health was a means for creating and implementing ideas of citizenship.[90] In Bolivia, the push came after their 1952 revolution.[91]
