Published in the October 2013 80th Anniversary issue

The broken illuminate the unbroken.

I wrote that line a couple years ago, in an article about a man named Henry Molaison, whose brain my grandfather partially destroyed. The destruction was well-intentioned: My grandfather was a neurosurgeon, and Henry came to him in the early 1950s suffering from a terrible case of epilepsy. My grandfather decided that the best way to treat Henry would be to perform an experimental resection of a large portion of his medial temporal lobes, including a small seahorse-shaped structure called the hippocampus. Unfortunately for Henry, almost as soon as he exited the OR, it became clear that you need your hippocampus to record the passing events of your life. Henry lived the next half century in more or less five-minute increments, the present always sliding away.

Henry's tragedy was a gift to the rest of us. "Patient HM," as he was known publicly until his death in 2008, became the most studied individual in the history of neuroscience, and his amnesia taught us an astonishing amount about how memory works. Scientifically speaking, he was the descendant of a host of other fascinatingly damaged men and women. Some were household names, like Phineas Gage, the railway worker whose frontal lobes and self-restraint were speared by an iron rod. Others were less well known, like Louis Leborgne, often referred to as Monsieur Tan because his damaged posterior left inferior frontal gyrus left him unable to formulate any words except one, over and over: tan, tan, tan, tan…. During the 1800s and 1900s, most of the fundamental advances in our understanding of how brains work were made through the study of individuals whose brains did not.

In recent decades, though, the bulk of the funding from the NIH and other major grant-giving organizations has shifted to studies based on scanning healthy research subjects with fMRI or other new neuroimaging technologies. And now, two huge moon-shot-type projects, Obama's billion-dollar BRAIN Initiative and its European analog, the Human Brain Project, are promising to someday allow us to entirely map and even simulate the brain, diminishing the need for human subjects.

In announcing his decade-long initiative, the president described "the three pounds of matter that sits between our ears" as "this enormous mystery waiting to be unlocked" and declared "the BRAIN Initiative will change that by giving scientists the tools they need to get a dynamic picture of the brain in action." He mentioned Alzheimer's, Parkinson's, and epilepsy as just some of the neurological scourges that might be conquered.

Then again, they might not. It's worth remembering that not long ago a different president announced "a new era of discovery is dawning in brain research," fueled by "advances in brain-imaging devices" that would allow us to do things like "mapping the brain's biochemical circuitry." He mentioned Alzheimer's, Parkinson's, and epilepsy as some of the neurological scourges that might soon be conquered, and he seemed so confident that tech-fueled breakthroughs were on the immediate horizon he made a solemn proclamation: "Now, therefore, I, George Bush, president of the United States of America, do hereby proclaim the decade beginning January 1, 1990, as the Decade of the Brain."

Who knows, maybe Obama's decade of the brain will be more fruitful than Bush's. Regardless, it'd be a shame if the current funding trend — throwing money at impersonal, technology-centered research while choking off more personal approaches — holds. Because while the current pittance doled out to studies of unusual brains might make you think they're a methodological dead end, they're not. The present-day descendants of Henry are as illuminating as he was. People like Patient SM, a 46-year-old woman in the Midwest with a broken amygdala and no sense of fear, continue to shake up our understanding of ourselves. No matter how advanced our technology becomes, no matter how good we get at creating digital simulacra of the brain, we should never ignore what these flesh-and-blood curiosities can teach us. Their brains, each tantalizingly incomplete, are as irreproducible as they are irreplaceable.
