The tools of medicine

There’s more than one way to secure an airway. Emergency medicine residents and paramedics learn early about the tools at our disposal to ensure that air gets into a patient’s lungs. In my residency, we called these tools airway toys, especially when we tried them out at conferences. The right toy could save a life, and there were always newfangled ones coming along.

So it goes throughout medicine today, with its explosion of new medical devices and the increasingly ubiquitous computer. Biologically, our bodies are largely structured as they were in the Neolithic age, when practitioners used stone blades to carve holes in the skulls of patients—who often survived the procedure. But our instruments are now infinitely more complex, our era the most tech-saturated in history. Many of us owe our lives to the tools of medicine—including those that have brought unforeseen consequences.

Diagnostic devices, for example, have quietly and profoundly changed the doctor-patient relationship. Such devices haven’t been around long, and the first ones were comparatively simple. René Laennec’s 1816 stethoscope was a wooden tube; it impressed many, not least those who were shy about laying an ear on the chests of buxom women, but some physicians continued to rely on the old technique and on feeling the patient’s pulse. In 1868, Carl Wunderlich published a landmark work on monitoring body temperature—a proposal that did not go over well with some of his German colleagues steeped in a philosophical rather than observational approach to medicine. Hermann von Helmholtz used his ophthalmoscope to observe the retina in 1850, wowing colleagues the following year at the Great Exhibition in London; in 1868, John Aylwin Bevan wrote to The Lancet that he had “discovered” a candle-powered esophagoscope, with which “morbid growths, &c., can be clearly seen.” In the meantime, laboratory medicine was beginning to flower, thanks in part to the mid-19th-century adoption of the medical microscope and advancements in histology over the latter half of that century. Scipione Riva-Rocci gave the world the sphygmomanometer for measuring blood pressure in 1896, and Willem Einthoven introduced the electrocardiograph—it weighed 600 pounds and required five assistants to operate it—in 1902. Wilhelm Roentgen’s X-rays debuted in 1895, and dominated medical imaging until the 1950s, ’60s, and ’70s saw the first ultrasounds, CT scans, and MRIs.

Is diagnosis a zero-sum game?

Today, among practitioners trained in modern imaging and lab tests, practicing without them can seem almost unthinkable. In 2005, doctors in the United States ordered an estimated 1 billion needle punctures of veins for blood sampling, and by 2007 they were ordering 80 million CTs a year.

Such wizardry can make diagnosis seem like a zero-sum game. Why cultivate the difficult orally transmitted art of physical diagnosis when it’s so much quicker and more accurate to order a study? In the process, costs go up, along with such unintended consequences as radiation-induced cancers from too many CT scans. Or the patient can find himself parceled, body part by body part, system by system, amongst specialists. Or “incidentalomas” pop up, unexpected findings that may not truly warrant treatment, but that can cause anxiety and even harm from further testing. There is also a loss of resiliency, something any tech-saturated physician discovers if she tries practicing in a developing country and realizes she’s forgotten the difference between S3 and S4 heart sounds. Back at home, we may be just a power outage or hurricane away from discovering how much our effectiveness depends on these technologies.

In fact, older physicians like Irwin Braverman, M.D. ’55, HS ’56, professor emeritus of dermatology, are concerned by what they see as a loss of physical diagnosis acumen among today’s house staff and attendings—a phenomenon historians of technology call deskilling. When Braverman trained, the most admired senior physicians were highly observant and sharply deductive. With X-rays their main confirmation tool, they had to be. “Physicians were using all their senses plus whatever images they had to pull things together. They were actually pretty good at it,” Braverman says. Today, by contrast, he says, “the doctors’ cognitive skills keep declining because they rely so much on technology.” (Ironically, at least one group of technologies—simulators—can be used to train students in old-fashioned physical diagnosis skills.)

The skills humans bring to the task

At worst, too much specialty testing can lead to a potentially fatal valuing of abstraction over examination, a sort of cognitive abandonment of the patient. Thomas P. Duffy, M.D., professor emeritus of medicine, recalls a patient with fever, back pain, and neurological findings whose physical exam suggested an epidural abscess—an emergency that requires immediate treatment. Yet that treatment was delayed by the providers’ wish to schedule an MRI to confirm the diagnosis. Somehow, the diagnosis seemed not to exist until a machine said it did.

Yet these tests can uncover crucial information that no physical exam ever could. The question, says historian Joanna Radin, Ph.D., is how to combine knowledge from scan results and the expertise of an experienced physician, rather than reflexively valuing one kind of knowledge over the other.

“There are many, many examples of the ways in which technology has been introduced in a way that doesn’t honor the skill and the richness of knowledge that humans bring to a task,” says Radin, assistant professor in the history of medicine, of anthropology, and of history. “Which isn’t to say that we should reject technology—but it’s worth considering what ways we want to deploy technology.”

When old practices fall by the wayside

In the 1980s, amid bitter controversy, laparoscopy transformed surgery. Inspired by gynecologists who were employing the technique for tubal ligations in the late 1970s, some pioneering surgeons tried it a few years later for gallbladder removal and were vilified by their colleagues. What was then the standard open surgical technique for gallbladder removal had remained essentially unchanged since the 1940s. Ultimately, though, laparoscopic cholecystectomy became the gold standard for uncomplicated gallbladder disease, driven by patient demand and studies finding advantages like less postoperative pain and shorter hospital stays.

Those attending surgeons who did make the switch had to learn laparoscopy on the job; one study in Germany found that this catching up meant temporarily depriving trainees of available cases.

There’s another problem, one that runs parallel to what physicians like Braverman are noting: these days, so many routine cholecystectomies are done via laparoscopy that younger surgeons seldom use the open technique any longer. If complications force them to “convert” to the once-routine open procedure, the surgeons’ inexperience could lead to patient harm. In 2012, a group of Harvard surgeons wrote that for that very reason, open cholecystectomy is no longer the safe alternative it once was. Last year, this situation led a team of London historians to reenact 1980s-era open cholecystectomies in a period operating suite with the aid of retired surgeons and nurses, in the hope of preserving knowledge of the older technique.

Technology is neither good nor bad—nor neutral

A koan-like axiom of the history of technology proposed by historian Melvin Kranzberg, Radin adds, holds that technology is neither good nor bad—nor is it neutral. Its value is all in how you roll it out.

There might be no more heated debate about that idea today than the one that is taking place over electronic health records (EHRs). As practitioners and hospitals across the country scramble to switch from paper records to meet federal benchmarks, some commentators are pointing out unintended consequences. The promise of convenient record-sharing and streamlined billing, not to mention access to clinical data for studies, has been offset by design flaws (like defaulting to the wrong measurement units); high cost; and the alienating, time-consuming task of data entry. It’s a less than ideal doctor-patient encounter, after all, if the doctor is chiefly communicating with a computer monitor.

Yet EHRs, Radin says, can offer an opportunity “to bring the technology more in alignment with what makes a physician an excellent physician.” For instance, a system may automatically rate physicians for speed, forcing them to hurry through history-taking. What if, Radin asks, those same programs rated the physicians for the number of adjectives they chose to describe the patient, guiding them to record a more richly descriptive history?

Lisa Rosenbaum, M.D., made a similar point in an October 22, 2015 New England Journal of Medicine perspective piece.

“We measure many things that have no value to patients, while much of what patients do value, including our attention, remains unmeasurable,” Rosenbaum wrote. “The technology will support and improve medical care only if it evolves in ways that help, rather than hinder, us in synthesizing, analyzing, thinking critically, and telling the stories of our patients.”

Better alignment of technology in this way requires not only conversations between users and designers, but also paying attention to what’s being lost, Radin says. “Technology is always about changing human relationships,” she says. In the humble stethoscope, there is precedent for such thoughtful design. In its modern form, says Duffy, the stethoscope’s tubing isn’t the optimum length for good acoustics: if it were, it would be shorter. Rather, it represents a compromise between good acoustics and the need to honor the patient’s personal space. The modern stethoscope can create an intimate yet respectful encounter, one that preserves the solemnity and grandeur of professional mastery.

During auscultation, Duffy says, “No matter what I hear, the patient is fascinated and always asks me, ‘What did you hear, doctor?’ … This is the beginning of a wonderful trust, because of the hovering, listening, [and] attention. It’s quiet, and it has a magic all its own. We shouldn’t give that up.”

In the mid-19th century, Hermann von Helmholtz invented the ophthalmoscope, which allowed physicians to observe the retina.