When Alfred Nobel, the inventor of dynamite, died a bachelor in 1896, the will he left behind instructed that his considerable fortune be used as prize money for young geniuses so that they might continue their life’s work. Little could he have anticipated that the prizes he had set in motion—to be awarded in the categories of physics, chemistry, medicine, literature, and peace—would become so prestigious, or so contentious.

The annual awarding of the Nobel Prizes ignites the sort of controversy that everyone loves to weigh in on. Which worthy contenders were overlooked? Which undeserving candidates were unjustly rewarded? But such quibbling is to be expected; as a look back at this collection of Atlantic articles on the subject illustrates, when it comes to the Nobel, controversy and debate are the name of the game.

In “Winning the Nobel Prize” (October 1950), Swedish-American author Naboth Hedin considered Alfred Nobel’s original establishment of the prize. The categories that are so familiar to us now, he pointed out, had their roots in Nobel’s personal interests and predilections:

It was second nature for him to think of awarding Prizes in the fields of Physics, Chemistry, and Literature; the fact that he had suffered from poor health all his life and was always on the outlook for a better cure no doubt prompted the Award in Medicine. And it was the Viennese novelist and pacifist Bertha von Suttner, his friend for many years, who probably inspired him to give the Prize for Peace.

Hedin also noted that the Nobel Foundation, set up to carry out Nobel’s vision, had failed to adhere to Nobel’s expressed hope that the prizes be awarded to young men with their careers ahead of them. The prize in literature, for example, is most often awarded “to a man firmly established in Letters, an author whose earnings have already made him independent, and it has come as an accolade to a career which has already reached its peak or passed it.”

When he won the prize for literature in 1930, for example, American novelist Sinclair Lewis was arguably past his prime—already deemed passé by the literati, according to biographer Mark Schorer in “Sinclair Lewis and the Nobel Prize” (October 1961): “The mood that Lewis had briefly exemplified more emphatically than anyone else was over,” Schorer wrote, “and Lewis was generally thought of as finished.” Even Lewis himself was taken aback to the point of disbelief by his receipt of the prize. Schorer described what happened when Lewis answered the telephone to hear a Swedish voice informing him he had won literature’s most prestigious award:

The voice was that of a Swedish newspaper correspondent in New York who had managed to track down Lewis for the Swedish Embassy, but Lewis thought that it was the voice of his friend Ferd Reyher, who liked to do imitations and play jokes. “Oh, yeah?” he replied. “You don’t say! Listen, Ferd, I can say that better than you. Your Swedish accent’s no good. I’ll repeat it to you.” And he repeated it, “You haf de Nobel Brize,” and more. The bewildered Swede protested in vain and finally called an American to the telephone to confirm the news. Lewis fell into a chair.

Though the prize may not have come to Lewis at the upswing of his career, it did indirectly end up serving Nobel’s goal of freeing a worthy author from financial hardship. Answering reporters’ questions about what he would do with the prize money, Lewis said he would “use it to support a well-known young American author and his family, and to enable him to continue writing.” (This response was interpreted by some foreign media, Schorer explains, “to mean that he was going to give the money away to some worthy young writing fellow, and hailed as an act of extraordinary magnanimity.”)

As scientist-turned-novelist Mitchell Wilson illustrated in “How Nobel Prizewinners Get That Way” (December 1969), Nobel Prizewinners’ reactions to receipt of the award tend to differ widely. “My God! What happens now to the rest of my life?” Chinese American physicist T. D. Lee exclaimed in 1957 after receiving the news that he had been awarded that year’s prize in physics. By contrast, Wilson described the more indifferent reaction of physicist Maria Goeppert Mayer, who had won the prize for physics in 1963:

“To my surprise, winning the prize wasn’t half as exciting as doing the work itself,” she said to me with some perplexity. “That was the fun—seeing it work out!” Even the memory of the lack of elation seemed to sadden her; yet her achievement was all the more remarkable because she had done her work when she was well into her forties and she had only recently come into the field of physics from chemistry, and most of all because she was a woman.

J. H. D. Jensen, with whom Mayer shared the 1963 prize, reacted with similar nonchalance, telling Wilson, “By the time it came, it didn’t really matter very much. The big moment for me had come years before when I learned that [1938 Nobel prizewinner Enrico] Fermi had put my name in nomination. I didn’t get it that year, but I didn’t really care. It was Fermi’s regard that was the ultimate honor for me, not the medal.”

Wilson explained that despite the prize’s prestige—or perhaps because of it—its effect on laureates is not necessarily one of increased productivity or creativity. Some have speculated that the falloff in creativity might result from increased demands on the newly minted celebrity’s time, or from the laureate’s realization that he or she no longer has to work. But Wilson rejected those two hypotheses, arguing that prizewinners are dedicated to their work above all else. Instead, he distinguished between two groups of prizewinners: those, such as Albert Einstein, Ernest Rutherford, and Enrico Fermi, who are brilliant enough to produce breakthrough after breakthrough, and those, such as Wilhelm Konrad Roentgen, who are only capable of producing one breakthrough in their lifetime. Roentgen’s research in X-rays earned him the first-ever Nobel Prize in Physics, but afterwards he failed to make any further significant contributions to science:

Men like Einstein, Rutherford, Fermi, and other giants, who are bigger than the prize, can win it at any time of their lives, take it in their stride, and go on continuing to be fruitful; while Roentgen and others like him who are smaller than the prize are overwhelmed by it—a heavy crown is only for very strong kings.

As for Roentgen’s 1901 prize in physics, it did not go undisputed; he was accused of taking credit for one of his students’ work. “He was the first, but certainly not the last, Nobelist to become involved in an ugly struggle for credit,” Wilson noted.

Indeed, the controversies continued. Harvard historian Donald Fleming began his piece, “Nobel’s Hits and Misses” (October 1966), by surveying some glaring omissions throughout Nobel Prize history and enumerating the systemic flaws from which those poor choices had stemmed. He argued that the rules set forth in Alfred Nobel’s handwritten will, along with other restrictions added later, had diminished the quality and restrained the scope of the prizes.

Three of the limitations were imposed by Nobel himself: that literature had to be ‘idealistic’ to qualify; that science meant a discovery, invention, or improvement, with the narrow definition of ‘discovery’ implied by his coupling it with the other terms; and that all prizes should be for the work of the preceding year.

He expanded upon this last criticism of the prizes in science by outlining an additional difficulty faced in that category: prizes were to be awarded only to individuals, yet many significant breakthroughs in science are driven by collaborative, incremental work. Important medical advances, such as the discovery of sex hormones and vitamin D, he noted, had been overlooked because they resulted from the work of too many scientists.

He made special mention of a few of the Nobel Prize selection committee’s “outright blunders.” The “bellicose” Theodore Roosevelt, he pointed out, had been an odd choice for the 1906 peace prize. Rudolf Eucken, he noted, is “a deservedly forgotten philosopher who was never important,” yet he was awarded the 1908 prize for literature. And J. J. R. Macleod, who had shared the 1923 prize in medicine for “the discovery of insulin,” had in fact had very little to do with insulin’s discovery. His winning of the prize, Fleming explained, had “arisen out of sheer ignorance of the facts”—Macleod had merely provided laboratory space and some general guidance to others. As for the choices for prizes in literature, Fleming described their record as “inexcusably bad”:

In addition to most of the giants of world literature, the non-winners have included Anna Akhmatova, Aleksandr Blok, Karel Capek, Jaroslav Hasek (of The Good Soldier Schweik), Stefan George, Arthur Schnitzler, Hugo von Hofmannsthal, Robert Musil, Paul Claudel, André Malraux, Miguel de Unamuno, Ortega y Gasset, Italo Svevo, George Meredith, H. G. Wells, Katherine Mansfield, E. M. Forster, Virginia Woolf, Dylan Thomas, William James, Theodore Dreiser (the runner-up to Sinclair Lewis in 1930), Edith Wharton, Scott Fitzgerald, Ezra Pound, and Robert Frost. Prizes for Swinburne and Paul Valéry were in the making when they died—not exactly prematurely: they were both in their seventies.

One of Fleming’s primary criticisms, shared by Hedin, was that the Nobel Prizes had failed to serve the original purpose of nurturing fledgling careers. Again, he turned to the prizes in literature to illustrate this point:

With the clear exception of Yeats and the possible exceptions of O’Neill, Camus, and Sartre, no author has been caught while his career was still on the upswing: the average age of the winners has been over sixty. Only seven men, including Kipling and Camus, have been recognized in their forties. It is easy to see how this dismal record came about. The Academy wanted to be sure about the winners’ ultimate stature. But this is quite simply a violation of Nobel’s entire purpose. He wanted to recognize the most impressive recent book, not to set the seal upon the work of a lifetime or to reward the capacity for literary and physical endurance.

Finally, in “Nobel Sentiments” (March 2002), P.J. O’Rourke joined the chorus of criticism, in the context of mocking an idealistic—and in his view ridiculous—joint letter that had been penned by 103 Nobel laureates. “Making fun is especially tempting,” he wrote, “to those of us who will receive invitations to Stockholm only in the form of brochures from Scandinavian cruise-ship lines. Let me give in to temptation.” He, too, had a list of glaring omissions:

Ernest Hemingway but not James Joyce? Toni Morrison but not John Updike? Dario Fo? Selma Ottilia Lovisa Lagerlöf? (She wrote The Wonderful Adventures of Nils, a fanciful account of a young boy’s travels across Sweden on the back of a goose.) And allow me to be the millionth person to point out that among the Nobel Peace Prize winners are Yasir Arafat, Shimon Peres, Henry Kissinger, Le Duc Tho, and International Physicians for the Prevention of Nuclear War. (“If the mushroom cloud doesn’t clear up, call me in the morning.”) For all I know, the lists of prizewinners in physics, chemistry, medicine, and economics are just as wack. I’m not competent to judge. Although the Cambridge University professor Brian Josephson (Physics 1973) says, “There is a lot of evidence to support the existence of telepathy.” And a co-discoverer of DNA, James Watson (Medicine 1962), is, at age seventy-three, researching the effects of sunshine on sex drive.

Despite the perennial nay-saying and controversy, the prizes still retain their prestige and authority. Perhaps it is a testimony to the relative insignificance of the prizes’ misfires—or simply to public ignorance of them—that, as Fleming writes, “the luster of the Nobel Prizes has remained absolutely undimmed as the most glittering recognition of intellect that can come to a man or woman of the twentieth century.”
