Children's Health: A Brief History

Jan Richter

December 8, 2004

Infant Deaths Common for Centuries
Most of what we know about child health in North America dates from after the arrival of European settlers. One thing is clear: among Indians, settlers, and slaves, the period from first contact until the late 1800s was one of high child mortality.

Until well into the 19th century, parents—rich and poor—were helpless to prevent or combat the fatal infections and diseases that claimed many of their children. Cotton Mather, for example, was born in 1663 and became one of the most prominent citizens in the Massachusetts Bay Colony. Of his 15 children, only two survived into adulthood. Overall child mortality rates were even higher in the Virginia colonies, where the climate contributed to disease, than in Massachusetts.

The death rate among black slave infants was exceptionally high, probably due to factors including poor nutrition, crib death, and inadequate prenatal and postnatal care.

Indian child mortality was probably fairly high even before Europeans arrived, but it was pushed higher—to an estimated 50 percent mortality rate—by epidemics of European-introduced infectious diseases like smallpox, to which the tribes had no immunity.

Public Health Efforts Turn the Tide
Starting in the last quarter of the 19th century, doctors and scientists began to gain a much better understanding of the sources of illness. That knowledge helped drive public health efforts, like "Safe Milk Campaigns" to pasteurize milk and dispense free milk in some cities. In the 1880s, public campaigns encouraged doctors to put silver nitrate in newborns' eyes to prevent blindness. Compulsory vaccination programs also had their roots in the late 1800s, such as the 1897 New York law mandating that children be vaccinated against smallpox.

These public health efforts led to a steep decline in infant and child mortality, at least among European settlers and their descendants. By the end of the 19th century, most parents could assume that their babies would remain healthy and survive. As a consequence, birth rates among the middle and upper classes began to decline and average family size decreased. In the 1600s, the average family had eight children. By 1900, that had fallen to four.

Healthier Cities, Better Medicine
Urban public health improvements of the late 19th and early 20th centuries, like improved sanitation services and treated municipal water systems, made the cities healthier for growing children. Successive campaigns to find ways to prevent catastrophic diseases, like whooping cough, diphtheria, cholera, yellow fever, and polio, helped bring the threat of childhood diseases under control.

As a result, by the late 20th century unintentional injuries—primarily traffic and gun injuries—had become the leading cause of death for children and youth past the age of one year. But while these changes reached every group, the benefits have not been uniform: families of color have experienced persistently higher mortality rates for infants, children, teens, and mothers than white families.

Shifting Role of Government
Private charities and public programs both have played vital roles in improving child health and reducing mortality rates. But public support for a federal role in child health has been intermittent for much of our history, reflecting disagreement over the proper roles of private philanthropy, the federal government and the individual states.

In the early 20th century, public attitudes toward the role of government began to shift. Progressives began framing social problems not as the result of individual failings, but as the result of social conditions. From this perspective, solutions lay not in improving the moral fiber of parents but in addressing community needs with collective actions. In the early decades of the 20th century, the activists of the Progressive Era sought government resources to improve children's health along with a broad array of public health and social services designed to improve living conditions, especially among low-income and immigrant urban families.

The Children's Bureau, a federal agency established in 1912, proclaimed the principle that all children had a right to health care, including dental care. But the Children's Bureau's call for universal health care for children was never realized, as support for public contributions to a health care system for children waxed and waned over the decades.

When women got the vote in 1920, the progressive impulse for government activism was strengthened by an expectation among elected officials that women would vote as a bloc in favor of public services and programs geared to helping families with children. The landmark Sheppard-Towner legislation of 1921 gave the federal government a leadership role in providing services for individual citizens, reversing the longstanding Pierce doctrine of laissez-faire.

Sheppard-Towner established federal matching grants to states in support of maternal and child health. At a time when many states lacked state health departments or even a system for issuing birth certificates, Sheppard-Towner clinics staffed by public health nurses gave young mothers access to professional advice about their babies' health and nutrition.

But Sheppard-Towner lost political support when the expected "women's vote" failed to materialize. Public aversion to government funding for social programs and opposition from the American Medical Association (though not the American Pediatric Association) contributed to the repeal of the act in 1929. On the eve of the Great Depression, finding ways to provide health care services for children whose families were isolated or impoverished was left in the hands of volunteer agencies and private charities.

Private charities, unable to fully meet the need even in good economic times, were quickly overwhelmed. While some states tried to supplement private philanthropy, many agencies soon ran out of money.

The exception was the state of Michigan, where, in 1929, Senator James Couzens gave $10 million to establish the Children's Fund with the unusual requirement that all the money be spent within 25 years. The Michigan Children's Fund continued to finance maternal and infant health clinics throughout the Depression. The Fund also supported basic research in children's nutrition and gave money for school lunch programs. Unlike in other states, infant mortality rates in Michigan declined during the Depression.

America's entry into World War II in 1941 helped speed public investments in industry, spurring the country's economic recovery. The return to prosperity helped improve the health and wellbeing of children, which had declined during the Depression of the 1930s. New federal programs aimed at supporting the military also benefited children's health.

In 1943, a new emergency maternal and infant care program was instituted to provide for families of servicemen engaged in World War II. But the program was disbanded by Congress as soon as the war was over.

Another federal program instituted in response to military needs was not so short-lived: the National School Lunch Program (NSLP) was created by Congress in 1946 as a "measure of national security." It was a direct response to the fact that many of the young men responding to the World War II draft were rejected due to conditions arising from serious nutritional deficiencies. Today, the School Lunch Program provides nutritionally balanced, low-cost or free lunches to more than 25 million children each school day.

For many families raising children in the first half of the twentieth century, the fear of polio was intense. By the early 1950s, iron lungs were keeping children paralyzed by polio alive in communities throughout the country. The fact that President Franklin Roosevelt had been stricken with polio as an adult helped spark a nationwide volunteer fundraising effort to finance research into the disease: the March of Dimes was organized by Basil O'Connor, a close friend of Roosevelt. The research paid off in the mid-1950s with the development of a usable polio vaccine.

Improvements in children's health and survival over the last century have been dramatic. Mortality rates for children older than age one declined considerably during the 20th century, due in large part to advances in medical technology, improved socioeconomic conditions, and improvements in water and food safety and sanitation practices. It is estimated that 1.3 percent of children born in 2000 will die before they reach the age of 20, compared to 10.9 percent of children in the early 1930s.

But the gains have not been shared equally among all children. Lead poisoning, tooth decay and infection, poor nutrition and hunger, chronic asthma, and gun injuries, for example, all afflict poor children at higher rates and with greater severity than middle-class and affluent children. Black children are almost twice as likely as white children to die before reaching age 20. Children living in poverty die at a rate two and a half times the rate of wealthier children.

While unintentional injuries are still the leading cause of death among children after infancy, children's health is once again increasingly endangered by diseases and other health-related conditions. Rising rates of childhood obesity, which put children at greater risk for heart disease and diabetes, the growing threat of incurable diseases like AIDS, and the toxic effects of environmental hazards on young children's development are among the challenges facing children's health at the beginning of the twenty-first century.