Long-lasting American Indian Health Disparities: An Interview with Dr. David Jones

INTERVIEW BY MOHAMMAD ZIAD ABDULGHANI

American Indians in the United States have faced health disparities for over 500 years. Dr. David Jones, Harvard University A. Bernard Ackerman Professor of the Culture of Medicine, has been a key advocate challenging immodest claims of causality regarding American Indian health. His initial research focused on epidemics among American Indians, resulting in a book, Rationalizing Epidemics: Meanings and Uses of American Indian Mortality since 1600 (published by Harvard University Press in 2004), and several articles. Jones has also examined human subjects research, Cold War medicine, HIV and other sexually transmitted infections, and the history of cardiac surgery. The first book from this work, Broken Hearts: The Tangled History of Cardiac Care (Johns Hopkins University Press, 2013), examines why it can be so difficult for physicians to determine the efficacy and safety of their treatments. He is now at work on two follow-up books. One, On the Origins of Therapies, will trace the evolution of coronary artery bypass surgery. The other examines the history of heart disease and cardiac therapeutics in India.

Previously, Dr. Jones had been appointed as a MacVicar Faculty Fellow, MIT’s highest honor for faculty who have made sustained contributions to undergraduate education. He also taught as a lecturer in the Department of Global Health and Social Medicine at Harvard Medical School, where he was awarded the 2010 Donald O'Hara Faculty Prize for Excellence.

Mohammad Ziad Abdulghani, Co-Managing Editor at HHPR, conducted this interview with Dr. David Jones to inquire about why American Indian health disparities have persisted for so long.

Mohammad Abdulghani (MA): Common arguments about the rapid population decline of American Indians center on their strong susceptibility to European diseases. How would you argue against the genetic susceptibility of American Indians?

David Jones (DJ): For any kind of health disparity between two populations, there are three basic explanatory options. Option one: some trait has left them more susceptible to the disease, something intrinsic or inherent in their bodies. When we discuss inherent causes of susceptibility now, we usually talk about genetics. In the seventeenth and eighteenth centuries, observers often emphasized religious causes of susceptibility.

Option two: There is nothing wrong with the people per se that leaves them vulnerable. Instead, there is something harmful about the environment in which they live. In colonial times, observers focused on the physical environment (imagine the health risks of a winter in New England or South Dakota without modern houses, clothing, or technology!). By the 20th century, most of the focus was on poverty as an underlying cause of disease and disparities.

Option three: People’s behavior. Whatever environment people live in, they make choices that can leave them vulnerable to disease. In the 18th century, missionaries blamed Indian diseases on their diet or exposure to the environment. They often blamed Indians’ practice of sitting in a sweat house for hours and then jumping into cold water (keep in mind that the English rarely bathed). When doctors now worry about behavior, they focus on alcohol, diet, or cigarettes.

It can often be difficult to discern the line between behavior and environment, especially since people’s environments can constrain the choices they make about health behaviors. Is it fair to say that poor people make bad choices about nutrition? They often don’t have the resources or opportunities to make better ones.

It is important to work hard to figure out the best explanations of health inequities because it is these explanations that motivate policy interventions. If the cause of a health disparity is some intrinsic difference between the populations, it is not always easy to figure out what a useful policy intervention might be. But if the problem is environmental, you can try to remediate the situation.

This always becomes politicized. If you look at history, you will see that observers have always identified multiple possible causes of any health disparity. They then have latitude to choose which causes to emphasize. Time and time again, they have emphasized the one that is most useful or expedient.

MA: How has the misrepresentation of American Indian susceptibility to disease worsened health conditions for the Indians and served to justify U.S. colonialism?

DJ: When John Winthrop and his colleagues planned the Puritan migration to Massachusetts in 1630, they wrote legal briefs to justify their right to settle the land. One of their key arguments was that the Indians had begun to suffer epidemics, which the English interpreted as a means by which God cleared the land for English settlement. By interpreting the epidemics as divine judgment, they justified their colonizing mission. Once the colonists arrived in Massachusetts, many saw the epidemics not just as inevitable (and divinely ordained), but as a blessing. It is easy to imagine that this would have compromised any motivation to intervene to care for the suffering Indians.

Similarly, in the late nineteenth century, the federal government had to justify the confinement of American Indian populations onto the reservations. Many officials saw the world in terms of a conflict between the white, red, and black races. They had little doubt in their own (white) superiority and in the inevitable extinction of both American Indians and African Americans. They interpreted dire outbreaks of tuberculosis and other infections as evidence of Indian inferiority. They believed that the “Indian problem” would be solved by 1920 as Indians either died out or were assimilated into white populations. In this racist worldview, the reservation system was envisaged simply as palliative care for a dying race.

Of course, American Indians did not die out, and their populations have instead increased over the 20th century. This has left many of them mired in poverty, in large part because they are stuck living on reservation lands that were never meant to sustain these populations in the long run. Most epidemiologists now see the ensuing impoverishment of American Indians as the root cause of their health inequities today. And 19th century federal Indian policy is the root cause of that poverty.

MA: American Indians have faced many adversities over the past 500 years. Why have Indian health disparities persisted for so long?

DJ: After reviewing the long history of Indian health inequities, I came to the conclusion that the most likely explanation of the persistence of health disparities was the persistence of the underlying social and economic disparities. Although many European colonists struggled in the early years of settlement, they quickly established political and economic regimes that left American Indians relatively disadvantaged. Disparities in health status soon followed, whether with smallpox in the eighteenth century, tuberculosis in the nineteenth century, or diabetes and substance use today.

Over this time there have been recurring efforts by Europeans and then white Americans to do something to intervene. While it is not always possible to decipher the motivation of historical actors, their acts are often quite suggestive. In the midst of a smallpox epidemic in New England in 1633-1634, some colonists did provide nursing care to dying Indians and adopt orphaned children. Thomas Jefferson, who knew well the toll of smallpox on American Indians, made repeated efforts to introduce vaccination (which had just been developed by Edward Jenner) to the Indians. He instructed Lewis and Clark to vaccinate the tribes they encountered as they explored the Louisiana Purchase. Tragically, the vaccine material they took with them was inactive, and they had to abandon this scheme. The history of the interior of North America might have been very different had Jefferson’s plan succeeded.

When the government confined tribes onto reservations in the 19th century, it often promised to provide medical care. The government, however, rarely followed through on this promise. Indian health care in the late 19th and early 20th centuries was always underfunded and often ineffective. The government attempted to improve the situation when it created the Indian Health Service in 1955. While the IHS, with many talented and dedicated staff, has helped to bring about dramatic improvements in Indian health, many health inequities persist today.

This creates an interesting challenge for historical judgment: do you give the federal government credit for providing health services that have contributed to substantial improvements, or do you critique it for failing to achieve health equity? Both positions are justified.

MA: How have American Indians been blamed for their health disparities?

DJ: One of the best examples is provided by diabetes. In the 1960s, medical researchers singled out the Pima Indians (now known as the Akimel O'odham) as the population with the highest rate of diabetes in the world. Researchers assumed that the cause must be genetic. This led to decades of research to understand the genetic basis for Pima diabetes. Over the ensuing decades, however, high rates of disease have emerged in other American Indian groups and, in fact, in many populations worldwide, especially those who have experienced poverty, marginalization, rapid social and cultural changes, or other factors that can cause stress or disrupt patterns of diet and activity. While there certainly are genetic factors that influence the risk of diabetes, most people now assume that the root causes of the epidemic of American Indian diabetes are social, not genetic, and that many of the social risk factors are the result of a combination of economic marginalization and regrettable federal policies. Had this been recognized and emphasized in the 1960s, it is possible that useful policies could have been introduced that would have done more good for the Akimel O’odham than decades of genetic research.

MA: U.S. policy regarding health care for American Indians has shifted over the years. President Andrew Jackson forced the removal of Indians during the Trail of Tears, and many Indians today live on U.S. reservations. What is the current state of U.S. health care for American Indians?

DJ: As with any population, it’s hard to generalize. If you look at everyone who identifies as American Indian as a single group, then they have the worst health outcomes of any of the official U.S. census race-ethnic categories. That’s obviously an enormous problem. If you increase the resolution a little bit, you start to see differences. Health status (both overall health and specific disease patterns) varies significantly between different Indian populations. Urban American Indians tend to have worse health than rural Indians who remain on reservations. It is not clear whether this is a consequence of urbanization or lack of access to health care (most IHS services are focused on the reservations).

Health status often depends on where a tribe lives. Some tribes have reservation lands near major population centers (e.g., certain tribes in Minnesota, Arizona, or California). These groups have sometimes been successful at opening casinos, experiencing financial windfalls that have transformed the prospects of tribe members. Health status will presumably improve if this economic success persists. However, many of the largest tribes, such as the Navajo or the Sioux, live on remote reservations or have chosen not to develop casinos. The poorest counties in the United States are now in South Dakota, at the Pine Ridge and Rosebud Reservations of the Oglala Sioux. These are also the counties with the lowest life expectancy.

MA: You have argued that Indian health has become a laboratory for global health. Why do you feel that this is the case?

DJ: The history here is complicated, but I will attempt to summarize it. As the Cold War revved up in the 1950s, the US and the USSR competed for allies on a global stage. Most countries were asked to pick sides. The USSR offered communism and the promise of redistribution of land and wealth, something that would have been very appealing for tenant farmers living in rural poverty. The US couldn’t offer redistribution—that would be communist. Instead the US typically offered technical aid programs, especially education and health care. They argued that investments in these areas, as well as in other types of infrastructure, would lead to economic development.

This raised an obvious question: what kind of health care would actually be most useful for people living in rural poverty? Coming out of World War II, there were many new technologies—antibiotics, immunizations, DDT—that gave people hope that the world could be made into a better and healthier place. But how could these and other health technologies be distributed to the world’s poor? The US did not have a lot of experience with this, as it had often neglected its own rural poor. Federal officials were actively trying to figure this out in the 1950s.

At this time, a group of researchers from New York Hospital stumbled across the epidemic of tuberculosis on the Navajo Reservation (one of the researchers had been sent to Arizona to investigate an outbreak of hepatitis; he did that, but was much more concerned about the tuberculosis problem). These researchers happened to be testing new antibiotics for tuberculosis, and they rushed out to Arizona to test these antibiotics on the Navajo (with the enthusiastic support of the Navajo tribal council). This research was a success. Having established a good collaborative relationship with the Navajo, the researchers proposed a follow-up study: implementing a comprehensive health care system in Many Farms, a small community in a remote part of the reservation where no good health care had existed. They saw this as a chance to test the ability of modern medicine to improve the health of people living in rural poverty. They hoped that the research would lead to improvements in health care on the Navajo Reservation. The researchers—and the federal government—also realized that anything they learned with the Navajo could be applied to international health projects worldwide. The Navajo, the researchers argued, provided a laboratory in which techniques of global health could be developed and tested.

The health care experiment at Many Farms ran for six years. In the end the researchers realized that modern medicine, for the most part, could do little for the major health threats facing the population (especially for child mortality). What the Navajo really needed was running water, safe housing, and economic development. But despite the more or less negative results of the trial, the researchers soon found themselves to be in high demand as consultants for international health. The basic tension found at Many Farms persists today: if you want to improve the health of a population in rural poverty, is it sufficient to provide them with access to decisive medical technology (e.g., antiretrovirals for HIV, antibiotics for tuberculosis, etc.), or should you invest in education and economic development? Obviously it is best to do both, and some programs (notably Partners in Health, co-founded by Harvard’s Paul Farmer) have done that. But in too many cases, whether in Indian country or around the world, that has not yet been done.