"We're not going to need pseudoephedrine," Walter White mutters through clenched teeth. "We're going to make phenylacetone in a tube furnace, then we're going to use reductive amination to yield methamphetamine."

The chemistry lab is the heart of AMC's Breaking Bad. Although this lab may have had scrappy beginnings, it is still a lab, a home of science. And that's fitting: Many illicit drugs today have their origins in formal laboratories, though ones that were far more above board than Walter White's.

This points to an interesting and unexplored dichotomy in the history of drugs: There's a huge gap, temporally and culturally, between the inventors of illicit drugs -- usually rather austere, cerebral, and disciplined -- and their consumers, whoever they may be.

An early depiction of cannabis from Jean Vigier's Historia das Plantas (1718), originally published in French in 1670 (The John Carter Brown Library at Brown University)

1. Cannabis
In the fall of 1689, the scientist Robert Hooke ducked into a London coffee shop to purchase the drug from an East Indies merchant and proceeded to test it on an unnamed "friend." It was evidently a large dose. "The Patient understands not, nor remembereth any Thing that he seeth, heareth, or doth," Hooke reported. "Yet he is very merry, and laughs, and sings... and sheweth many odd Tricks." Hooke observed that the drug eased stomach pains, provoked hunger, and could potentially "prove useful in the Treatment of Lunaticks."

Hooke strongly hinted that he'd personally sampled his coffee shop score: The drug "is so well known and experimented by Thousands," he wrote, that "there is no Cause of Fear, tho' possibly there may be of Laughter." (Hooke's readers would have had good reason to be afraid of a new drug: This was a world in which pharmacies sold ground-up skulls and Egyptian mummy feet.)

"The Beggars" by Pieter Breugel the Elder, a painting believed to show victims of ergotism.

2. LSD
During the time of the Roman empire, physicians described a painful disease called the sacred fire (sacer ignis), which by the Middle Ages came to be known as St. Anthony's Fire -- "an ulcerous Eruption, reddish, or mix'd of pale and red," as one 1714 text put it. Sufferers of this gruesome illness, which could also cause hallucinations, were actually being poisoned by ergot, a fungus that grows on rye. Several authors, most recently Oliver Sacks in his excellent book Hallucinations, have noted a potential link between ergot poisoning and cases of dancing mania and other forms of mass hysteria in premodern Europe.

By the 1920s, pharmaceutical firms began investigating the compounds in ergot, which showed potential as migraine treatments. A Swiss chemist at the Sandoz Corporation named Albert Hofmann grew especially intrigued, and in November 1938 (the week after Kristallnacht) he synthesized an ergot derivative: lysergic acid diethylamide, or LSD for short.

It was not until five years later, however, that Hofmann experienced the drug. Immersed in his work, Hofmann accidentally allowed a tiny droplet of LSD to dissolve onto his skin. He thought nothing of it: Hardly any drugs are psychoactive in such minute doses. Later that day, however, Hofmann went home sick, lay on his couch, and:

sank into a not unpleasant intoxicated-like condition, characterized by an extremely stimulated imagination. In a dreamlike state, with eyes closed (I found the daylight to be unpleasantly glaring), I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors. After some two hours this condition faded away.

Three days later, the chemist decided to self-administer what he assumed was a tiny dose to further test the drug's effects. He took 250 micrograms, roughly 10 times the threshold dose. Within an hour, Hofmann asked his lab assistant to escort him home by bicycle. Cycling through the Swiss countryside, Hofmann was shocked to observe that "everything in my field of vision wavered and was distorted as if seen in a curved mirror."

By the time he arrived home, Hofmann decided to call a doctor. However, the physician reported no abnormal physical symptoms besides dilated pupils, and Hofmann began to enjoy himself:

Kaleidoscopic, fantastic images surged in on me, alternating, variegated, opening and then closing themselves in circles and spirals, exploding in colored fountains, rearranging and hybridizing themselves in constant flux.

Hofmann awoke the next morning "refreshed, with a clear head," and with "a sensation of well-being and renewed life." In an echo of Hooke's report about his friend's cannabis experience, which left him "Refreshed...and exceeding hungry," Hofmann recalled that "Breakfast tasted delicious and gave me extraordinary pleasure."

Nagayoshi Nagai, an elder statesman of Japanese science and medicine. He and his wife hosted Albert Einstein in 1923. (Wikimedia Commons)

3. Meth
A member of the Meiji Japanese elite, Nagai devoted much of his energy to the chemical analysis of traditional Japanese and Chinese medicines using the tools of Western science. In 1885, he isolated the stimulant ephedrine from Ephedra sinica, a plant long used in Ayurvedic and Chinese medicine.

The year before, in July 1884, Sigmund Freud had published his widely read encomium to the wonders of cocaine, Über Coca. Cocaine was radically more potent than coca leaves, and chemists the world over were on the lookout for other wonder drugs. It's likely that Nagai hoped to work the same magic with ephedra, and in many ways he did. Ephedrine is a mild stimulant, notable nowadays as an ingredient in shady weight-loss supplements and as one of the few drugs that Mormons are allowed to indulge in.

In 1893 Nagai tried something new, using ephedrine to synthesize meth. It took the world a couple of decades to catch on. In 1919, a younger protégé of Nagai named Akira Ogata discovered a method of synthesizing the stimulant in crystalline form, giving the world crystal meth.

It wasn't until World War II, however, that meth became widespread as a handy tool for keeping tank and bomber crews awake. By 1942, Adolf Hitler was receiving regular IV injections of meth from his physician, Theodor Morell. Two years later the American pharmaceutical company Abbott Laboratories won FDA approval for meth as a prescription treatment for a host of ills ranging from alcoholism to weight gain.

About the Author

Benjamin Breen is a PhD candidate at the University of Texas at Austin. He is executive editor of The Appendix, a journal of experimental and narrative history, and is writing a book about the history of drugs in the 17th and 18th centuries.
