10 Scientists Who Were Their Own Guinea Pigs

The Scottish author Robert Louis Stevenson gave us a fairly succinct cautionary tale against self-experimentation when he published "The Strange Case of Dr. Jekyll and Mr. Hyde" in 1886. In the novella, Dr. Henry Jekyll, a law-abiding and generally mild-mannered member of society, experiments on himself with a concoction that turns him into an amoral and violent version of himself, the repugnant and murderous Mr. Hyde.

With only himself as both test subject and experimenter, Jekyll loses control of his experiment and finds he's transforming into Hyde without the aid of the drug. As investigators close in on his secret, he takes his own life. At least one moral of the story is pretty clear: Don't use yourself as a human guinea pig.

When Stevenson's story was published, self-experimentation was common, even rampant. Researchers in every field concluded that no one could better describe the effects of a drug, medical procedure or malady than themselves, and so served as both scientist and test subject in their own experiments.

Today, self-experimentation is abhorred by the scientific establishment. For one thing, it's dangerous; it also makes impossible a hallmark of scientific research, the double-blind study, since the experimenter always knows whether he's receiving the real treatment or a placebo. But over the centuries, self-experimenting researchers have contributed a great deal to our understanding of the brain, medicine and physiology. This list is an incomplete ode to those people who put science ahead of their own health.

10: Sir Henry Head

Sir Henry Head, a 19th-century British neurologist, was intrigued by the fact that people who suffered nerve damage could eventually regain sensation. Head wanted to precisely map the road by which sensation returned -- did sensation of hot and cold return before response to painful stimuli like pinpricks? However, Head faced a roadblock: The patients he interviewed painted only vague pictures of their sensations during experiments.

Faced with a less than desirable pool of study participants, Head opted to fully study nociception (pain) by experimenting on himself. "I shall know a great deal about pain by the time this experiment is over," he wrote [source: Watt-Smith]. On April 25, 1903, at the home of a surgeon friend, Head underwent surgery to sever the radial nerve in his left arm (he was right-handed) [source: Voytek]. The radial nerve branches from the spinal column to the fingers and controls movement as well as touch and pain sensations in the arm and hand. It's an important nerve -- and Head had his surgically severed. A section was removed and the two remaining ends were tied together with silk to enable regeneration. Three months after the surgery, Head had regained much of his ability to feel pain in his arm.

Over the next five years, Head was subjected to all manner of stimulation to his hand and arm by his co-experimenter, W.H.R. Rivers. Head developed a process he called the "negative attitude of attention," a sort of meditative state of deep introspection in which he focused his attention exclusively on the minute details of his senses. Thanks to Head's early study of nociception, we have a much greater understanding of how the human brain processes different tactile sensations.

9: Friedrich Sertürner

About 100 years before Head had his radial nerve severed, Friedrich Wilhelm Sertürner, a chemist in the German region of Westphalia, became the first to isolate what he believed was the alkaloid serving as the active ingredient in opium. Following a 52-step process that used ammonia to separate the alkaloid, Sertürner isolated crystals he dubbed morphine, after Morpheus, the Greek god of dreams [source: Altman]. Sertürner had good reason for the name -- he'd tested the drug on stray dogs in town, and it had literally put them to sleep. Minutes later, they went to sleep in a much more permanent fashion.

Despite the death of the dogs that were his first test subjects, the barely twentysomething Sertürner opted to move to human clinical trials, using himself and three 17-year-old friends. The chemist ingested and gave each of his friends a "grain" of morphine equal to about 30 milligrams [source: Cohen]. Sertürner handed out another round of grains 30 minutes later, and followed that by another round 15 minutes after that. In less than an hour, Sertürner and his friends had ingested 90 milligrams of morphine, ten times the recommended limit today [source: Altman].

In short order, it became clear that he and his fellow test subjects had overdosed, so Sertürner induced vomiting with vinegar. Everyone lived, though at least one friend spent the night in a deep sleep. The crystals the chemist tested on himself turned out to be what is still a leading pain relief drug today.

A 17th-century illustration of Santorio in his famous weighing chair device on which he virtually lived for 30 years.

8: Santorio Santorio

This Italian nobleman with a name so nice they used it twice was a literal Renaissance man: Santorio lived in Renaissance-era Padua, Italy, and divided his interests among a number of pursuits, including physiology. Santorio wondered whether what we ingest in the form of food and drink is equal to what we expel in the form of feces and urine. A dedicated scientist, Santorio decided he would spend 30 years carrying out a daily experiment, weighing himself as well as everything he ate and expelled, and tallying up the difference.

To carry out his experiment, Santorio constructed a weighing chair, a four-poster bench dangling from a beam that weighed him along with his food and expulsions. Santorio spent almost all his time working, eating, expelling, sleeping and, most importantly, weighing over the course of this 30-year experiment. He found his suspicion was correct: What we ingest weighs more than what we expel, and the difference can't be fully accounted for by the weight we gain from eating and drinking. This led Santorio to his theory of insensible perspiration, the idea that we constantly expel waste through the skin. While groundbreaking, it had no practical application at the time. But Santorio's work led to the study of metabolism, a breakthrough in our understanding of life [source: Minkel].

An early fan and willing experimenter of Albert Hofmann's LSD-25, pictured in New York's Central Park in 1968.

7: Albert Hofmann

One of the most notorious self-experimenting scientists was Albert Hofmann, the Swiss chemist who synthesized LSD-25, the drug that eventually fueled the expanding minds of millions of people in the 1960s and beyond. But there was a time before Hofmann or anyone else knew what LSD was capable of doing to the human mind, and that's when the chemist used himself as a guinea pig for his new compound.

In 1943, Hofmann was a chemist at Sandoz Pharmaceuticals synthesizing compounds derived from ergot, a fungus that grows on grain and has powerful hallucinogenic properties. While handling a preparation of one of these compounds, LSD-25, he began to feel strange. He went home, but the effects of the compound were intriguing enough that he returned to it three days later.

This time, Hofmann measured out 250 micrograms (millionths of a gram) and ingested it. In short order, he once again began to feel bizarre, and he left the lab, riding his bicycle home. This bike ride, part of the world's first trip, has come to be commemorated each April 19 as Bicycle Day by LSD adherents [source: NNDB]. At home, Hofmann recorded the effects of the drug. He wrote, "I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors" [source: Tweney].

While it was used for several years in psychotherapy, and by the CIA as a brainwashing drug, LSD was outlawed in 1967. Hofmann later wrote a memoir about his discovery, called "LSD, My Problem Child."

A botanist's illustration of deadly nightshade, one of the many lethal medicinal plants Jan Purkinje ingested in the name of science.

6: Jan Purkinje

Jan Purkinje, a Czech monk turned physician, earned his medical degree in 1819 and held a great deal of skepticism toward the doses of medicines prescribed by physicians in his day. He considered them far too small and "nothing but mysticism" [source: Altman]. So he set out to determine proper dosages by ingesting the drugs himself while paying close attention to their effects on his mental and physical faculties.

Purkinje tried a number of medicinal plants, like foxglove (the source of digitalis), which slows the heart and is known to blur vision. To study the physiology of vision, he overdosed on foxglove and sketched and described the vision problems he endured. He also ingested deadly nightshade (the source of atropine), which in large doses can stop the heart, to study its effects on vision. We now use atropine to dilate pupils thanks to Purkinje. And when word got out that this trained physician was experimenting on himself, others asked for his help. One of his teachers gave him extracts of ipecac and asked him to describe his reactions. By the end of the three-week experiment, he had conditioned himself to vomit at the sight of any brown powder that resembled the drug.

Over the years, Purkinje self-experimented with nutmeg, camphor, turpentine and a host of other drugs, which led to an increased understanding of dosage and drug interactions.

At a time when many researchers were exploring the now-defunct field of phrenology, Hermann Ebbinghaus created methodology still used today to explore cognitive function.

5: Hermann Ebbinghaus

Experimental psychology has also received a boon from researchers willing to self-experiment. Perhaps chief among them is German psychologist Hermann Ebbinghaus. He was among the first to apply the rigors of traditional sciences like physics and medicine to the investigation of higher cognitive functions, specifically human memory. Ebbinghaus provided methodology for studying the mind, as well as data, that psychologists still use today.

From 1879 to 1880, Ebbinghaus conducted a self-experiment on his memory by devising a series of 2,300 nonsense syllables, each a consonant-vowel-consonant three-letter string, which he committed to memory [source: Abbot]. Ebbinghaus went to the trouble of inventing his own syllables to reduce the chance that prior associations with real words would help him retain them. In other words, a syllable like "ska-" might evoke a fond memory of ice skating, giving his memory an extra boost that could skew the results.
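Ebbinghaus's stimulus set can be sketched programmatically. Here is a purely illustrative Python sketch of the idea: generate every consonant-vowel-consonant string from fixed letter sets, then drop anything that is a recognizable word. The letter sets and the tiny exclusion list are assumptions for the example, not Ebbinghaus's actual materials (he worked in German).

```python
from itertools import product

# Illustrative sketch only -- these letter sets and the tiny exclusion
# list are assumptions, not Ebbinghaus's actual materials.
consonants = "bcdfghjklmnpqrstvwxz"   # 20 consonants
vowels = "aeiou"                      # 5 vowels
real_words = {"bat", "cat", "dog", "sun", "pit", "cup", "map", "jam"}

# Every consonant-vowel-consonant combination, minus recognizable words,
# to avoid the prior associations Ebbinghaus was guarding against.
syllables = [c1 + v + c2
             for c1, v, c2 in product(consonants, vowels, consonants)
             if c1 + v + c2 not in real_words]

print(len(syllables))  # 20 * 5 * 20 = 2,000 combinations, minus the 8 excluded words
```

With a proper dictionary as the exclusion list, a pool on the order of Ebbinghaus's 2,300 syllables falls out of exactly this kind of exhaustive enumeration.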

Ebbinghaus observed, from his first one-year experiment and a follow-up experiment in 1883, a number of aspects of human memory that we now take for granted. He concluded that the more material there is to learn, the longer it takes to learn; that once material is learned and forgotten, relearning it takes less time than the original learning did; and that learning is most effective when the brain has time to absorb the information -- a finding that still argues against cramming for exams [source: Plucker].

Dr. Karl Landsteiner, who used his own blood to determine the existence of blood types and won the 1930 Nobel Prize for Medicine for his discovery.

4: Karl Landsteiner

When Austrian physician Karl Landsteiner began investigating blood, science explained the phenomenon of some individuals' red blood cells clumping when mixed with other people's blood as the result of some unknown disease or disorder. Landsteiner wasn't convinced, and he used his own blood, as well as that of some colleagues, to prove his theory that different people have different types of blood.

Landsteiner used the samples to show that human red blood cells carry different kinds of antigens, and that a person's serum contains antibodies that attack cells bearing foreign antigens. That attack causes the red blood cells to clump, which in turn leads to the rejection of a blood transfusion and, prior to Landsteiner's research, usually death. In 1901, Landsteiner identified three (and later a fourth) blood types through research on his own blood: A, B, O and AB [source: NobelPrize.org]. Through his self-experimentation, he paved the way for the blood type matching that makes life-saving transfusions and organ donations possible today.

Because Dr. Jack Goldstein submitted himself to a blood transfusion of treated type-B blood to prove his research, the available pool of blood donors for type O blood has expanded.

3: Jack Goldstein

In 1981, 80 years after Karl Landsteiner drew his own blood to prove the existence of blood types, another self-experimenting physician, Dr. Jack Goldstein, furthered the field of blood typing. In doing so, he expanded the pool of donors available to people with type O blood who need transfusions. This was an important advance; although people with type O blood can give blood to anyone, they can receive only type O blood themselves.

Goldstein discovered that an enzyme found in coffee, alpha-galactosidase, could render the antigens in type B blood harmless, effectively transforming it into something resembling type O blood. If that converted blood could be transfused into type O recipients, their pool of available donors would expand to include type B donors as well.

Since Goldstein had type O blood, he underwent a blood transfusion of type B red blood cells that had been treated with the enzyme, rendering it into type O blood. Having received the transfusion without an adverse reaction, Goldstein showed that the technique worked [source: Altman].

On days one through three this would have looked fairly normal to George Stratton during his inversion lens experiment.

2: George Stratton

Every once in a while, a researcher subjects himself to an experiment that would drive just about anyone else mad. Such was the case with George Stratton, a psychologist at the University of California in the 1890s. The visual information our retinas receive arrives upside down; the brain effectively inverts the image so that we perceive objects in our environment as right-side up. Stratton wanted to find out whether theories suggesting this inverted input was necessary for us to perceive things as upright were correct.

To find out, Stratton got his hands on a pair of inverting lenses, which essentially flip the world upside down for the wearer. In his first attempt at the experiment, Stratton found two lenses were too much to bear. Instead, he fastened one lens before one eye, blindfolded the other and began an eight-day, mind-bending experiment.

In the paper presenting his findings, Stratton wrote, "All images at first appeared to be inverted; the room and all in it seemed upside down. The hands when stretched out from below into the visual field seemed to enter from above. Yet although these images were clear and definite, they did not at first seem to be real things, like the things we see in normal vision, but they seemed to be misplaced, false, or illusory" [source: Stratton]. On day four, Stratton began to see his environment as right-side up once more, and after five days he was able to move carefully about his house [source: Cullari]. Stratton had shown that visual information can be presented to the brain either way, and the brain will eventually adapt.

During World War II, Britons were given ration books. They were loaded with coupons for bread, cabbage and potatoes, the triumvirate in Elsie Widdowson's wartime diet plan.

1: Elsie Widdowson

If Santorio Santorio established a proud tradition of self-sacrifice in the field of metabolic research, then perhaps his greatest heir was Dr. Elsie Widdowson. For much of her 60 years of study of nutrition and metabolism, the 20th century British researcher used herself as a willing test subject in her experiments.

In her early career, Widdowson and her longtime collaborator Dr. R.A. McCance combined their research on fruits, vegetables and meats to write "The Composition of Foods," a landmark book on nutrition still in use today. It was World War II, however, that led Widdowson and her colleague to self-experiment. Because the British government was rationing food, Widdowson set out to determine what healthy diet could be built from the meager and somewhat random assortment of foods most widely available to the average Briton during the war. Putting herself on starvation diets, Widdowson produced a diet founded on cabbage, potatoes and bread that could keep a person in good health, and she submitted it to the British government, which championed it [source: Martin]. To prove their diet worked, Widdowson and McCance showed it could sustain even the most brutal calorie-burning regimen. While half-starved, the two took to the mountains for grueling workouts, in one day burning 4,700 calories by walking 36 miles (58 kilometers) and climbing 7,000 feet (2.13 kilometers). Keep in mind that the average daily energy expenditure for a woman is around 2,200 calories [source: Martin, Smith]. Their findings were later used to help feed starved Holocaust survivors.

Widdowson self-experimented with other aspects of diet as well, including salt intake, and by injecting herself with iron she found that the mineral is regulated in the body through absorption rather than excretion, a finding that forms the basis for treating anemia [source: MRC].

Martin, Douglas. "Elsie Widdowson, 93, a pioneer in nutrition." New York Times. June 26, 2000. http://www.nytimes.com/2000/06/26/world/elsie-widdowson-93-a-pioneer-in-nutrition.html?pagewanted=all&src=pm

Sample, Ian. "Who are the hardest, bravest men and women in the history of science?" The Guardian. November 12, 2010. http://www.guardian.co.uk/science/blog/2010/nov/11/hardest-bravest-science

Stratton, George M. "Some preliminary experiments on vision without inversion of the retinal image." International Congress for Psychology. August 1896. http://www.cns.nyu.edu/~nava/courses/psych_and_brain/pdfs/Stratton_1896.pdfs