For nearly thirty years Cambridge University's Stephen W. Hawking has been the cosmologist most closely associated in the public mind with the phenomenon of black holes—cosmic concentrations of mass so dense that nothing can escape from them. The basic idea behind black holes has a long pedigree. The English geologist John Michell speculated centuries ago that a celestial body with a radius 500 times greater than that of the sun, but with the same density, would possess an escape velocity at its surface equal to the speed of light, meaning that light could not escape: "all light emitted from such a body would be made to return towards it, by its own proper gravity." Theorizing about black holes picked up considerable momentum after Einstein, and during the past several decades in particular has been the focus of much speculation. It was Stephen Hawking who brought the idea of the "event horizon" (the outer boundary of a black hole) to wide public attention, and it was Hawking who in 1976 proposed that black holes were essentially omnivorous, swallowing not only matter but also information (although it was possible, he thought, that the information might escape into new "baby universes" forming inside the black hole). Hawking's ideas about information loss contradicted known laws of physics; they led to deep divisions among scientists, and to a vast amount of science fiction.
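For readers who want to see the arithmetic behind Michell's 500-to-1 figure, here is a minimal numerical sketch. It uses modern values for the physical constants (which Michell did not have) and nothing but the classical escape-velocity formula; at fixed density, mass grows as the cube of the radius, so escape velocity grows in direct proportion to the radius, and a body roughly 500 times the sun's size reaches the speed of light at its surface.

```python
import math

# Modern values for the constants (Michell worked from cruder estimates).
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.957e8    # solar radius, m

def escape_velocity(mass, radius):
    """Classical escape velocity at the surface of a spherical body."""
    return math.sqrt(2 * G * mass / radius)

# A body with the sun's density but 500 times its radius:
# at constant density, mass scales with the cube of the radius.
scale = 500
v = escape_velocity(M_SUN * scale**3, R_SUN * scale)

print(f"escape velocity: {v:.3e} m/s  ({v / c:.2f} times the speed of light)")
# Prints roughly 3.1e8 m/s, just over the speed of light, as Michell argued.
```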

Now, after years of fuss, Hawking has conceded error. At the last minute he contacted the organizers of the 17th International Conference on General Relativity and Gravitation, to be held in Dublin. Hawking's message: "I have solved the black hole information paradox and I want to talk about it." When his turn came to speak at the conference, he had this to say: "I am sorry to disappoint science-fiction fans, but if you jump into a black hole, your mass energy will be returned to our universe, but in mangled form. There is no baby universe branching off, as I once thought." Emily Litella, a Gilda Radner character on Saturday Night Live, used to deliver editorial rants against "free Soviet jewelry" and "making Puerto Rico a steak," only to correct herself meekly when informed of her error by a colleague. Stephen Hawking's comments in Dublin amounted to a cosmological "Never mind."

It is always a little disconcerting when audacious scientific theories come a cropper. Sometimes what is lost is not just a specific explanation but a whole way of thinking, an entire world view. Perhaps for that reason, old scientific theories do not wholly die. Oscar Wilde once observed that "science is the record of dead religions." He might have added, in a codicil, that "metaphor is the record of dead science."

The theory of black holes hasn't been discredited in its entirety, just one of its more intriguing postulates. But even if the theory itself were sucked into a black hole, it's hard to believe that the black-hole metaphor—for the bedrooms of certain children, the minds of certain friends, the legal status of certain detainees—wouldn't be around more or less forever. Here's The Boston Globe commenting on memories of the 1960s at Democratic conventions: "They loom stage left, a blur to many of us, a black hole to the rest." Here's the Omaha World Herald on the fortunes of a neighboring state: "While Iowa has been the center of the whirlwind at caucus time, by November the state usually has been sucked into a political black hole."

Theories rejected eons ago as inadequate for the narrow purposes of science have proved far too useful to reject in the broader world of normal life. The idea that a Great Flood once destroyed most life on the planet is now untenable, but we still think of antiquated people and ideas as being antediluvian, "before the deluge." I am aware, and accept, that ships sailing off to the horizon will not actually reach a place where the world ends and the oceans spill off into the void. But regardless of what the geographers say, "falling off the face of the earth" is something that happens—to fashions, to celebrities, to popular culture. It is the explanation for any number of phenomena: What happened to the huge surplus that Gore and Bush sparred over? Where are Erik Estrada and Menudo, Vanilla Ice and Andrew Dice Clay?

I know that technically the conception of Earth as lying at the center of the universe, as proposed by the Alexandrian astronomer Ptolemy, is at odds with the established facts. But inadequate as it may be in cosmology, the Ptolemaic metaphor is relevant almost everywhere else. A natural-history exhibit was taken to task a few years ago for "a decidedly Ptolemaic view: the world revolves around New Jersey." Recently I saw an article about people whose lives revolve around their children—"Ptolemaic parenting," this was called. (I'm a committed Copernican myself.)

Reproductive specialists these days are understandably skeptical of the medieval notion that human beings grow to full size from a homunculus, a fully formed but hairless miniature person inhabiting sperm cells. But the homunculus as metaphor continues to propagate. Usually it refers to some small, original version of a much larger thing (Wisconsin's welfare program was said to be the homunculus version of the federal Welfare Reform Act of 1996; Iceland's ancient parliament, the Althing, is the homunculus version of Congress), but it can also mean someone who resembles the homunculi depicted in illuminated treatises. I have seen the Oscar statuette referred to as a "gilded homunculus," and Aristotle Onassis as a "leathery homunculus and shipping tycoon." Gollum, Mini-Me, James Carville—in their vastly different ways they all promote homuncular vitality.

Obsolete science survives as metaphor in "philosopher's stone" (used by alchemists to turn base metals into gold) and "spontaneous generation" (the idea that life springs into existence out of nothing, or that things can happen without a cause). Both these concepts come up a lot in, for instance, discussions of economic policy. Phlogiston, the hypothetical element once thought to account for fire, gets pressed into service as a stand-in for any mythical causative substance. (Graham Greene regarded cholesterol as akin to phlogiston.) The Greek physician Galen believed that four "humours"—phlegm, black bile, yellow bile, and blood—accounted for all bodily functions and human behavior, an antiquated conceit that no reputable scientist would now endorse. But one still hears references to "the humours," and personally I think that a mere four elements—solipsism, debt, litigation, and hype—could easily explain about 90 percent of human activity.

Psychoanalysis is a nearly boundless category unto itself. It has not yet succumbed totally to the inroads of Prozac and Paxil, but even if drugs come to dominate the clinical future, the concepts of id, ego, superego, Oedipus complex, repression, and the rest are unlikely to atrophy in normal discourse. Superego is conscience, morality, authority—Consumer Reports, the Boy Scout Handbook, Pope John Paul II. The id is the unconscious, the source of instinctual, sometimes shameful, impulses. The actor Jim Carrey was once described as "an insatiable, rampaging id," what's left over "once the layers of civilization have been peeled away." Philip Roth's Portnoy's Complaint has been called "an emblem of the national id." The Super Bowl halftime show, Fear Factor, talk radio—all these things are manifestations of id. An automobile expert had this to say in The Washington Post about how SUV manufacturers balance consumer desires and safety: "There is this fine line that they walk between id and superego."

"Relativity"? "Big Bang"? "Quantum leap"? "Natural selection"? "The uncertainty principle"? These terms are associated with ideas that are very much alive and in good theoretical standing, but who doubts that we'd still employ them metaphorically even if the underlying scientific concepts were shelved? All I know about chaos theory is what Jeff Goldblum explained in Jurassic Park; as actual science the whole concept may be overblown. But as an expression of ordinary social dynamics—the Enron scandal, the Kennedy family, the reconstruction of Iraq—there is clearly something to it.

A few weeks after Hawking's announcement Paul Ginsparg, a professor of physics at Cornell, published a newspaper commentary suggesting that the back-pedaling may have been premature—that Hawking's ideas might not in fact be antediluvian, and could still turn out to be right. As Emily Litella might have observed, the situation is now in a great steak of uncertainty.
