Last week, Sen. Bob Graham of Florida pulled out of the Democratic presidential race. It was sad but inevitable. Graham is a good man and a fine public servant, but he can never be president. Only four candidates have a shot next year. They are President Bush, retired Gen. Wesley Clark, former Vermont Gov. Howard Dean, and Sen. John Edwards of North Carolina. The rest are history. Sorry, Dick. Sorry, John. Sorry, Dennis, Joe, Carol, and Al. Turn off the lights behind you.

How do I know? Am I psychic? Mad? Possibly and probably; but in this case I rely on two factors. Following the conventional wisdom, I assume that former Illinois Sen. Carol Moseley Braun, Ohio Rep. Dennis Kucinich, and civil-rights activist Al Sharpton are too marginal to win, though I wish them luck. That leaves Missouri Rep. Dick Gephardt, Massachusetts Sen. John Kerry, and Connecticut Sen. Joe Lieberman. Their problem is different. They've expired.

As every grocer knows, many products have sell-by dates. Bread lasts a day or two, milk maybe a week. Well, presidential aspirants have a sell-by date, too. They last 14 years.

Herewith, Rauch's Rule. Actually, it was pointed out to me by a young political genius named—but I can't tell you his name, because he works in a government job and asked me to keep his name out of my article. Sadly, I must myself take credit for the Law of 14:

With only one exception since the presidency of Theodore Roosevelt, no one has been elected president who took more than 14 years to climb from his first major elective office to election as either president or vice president.

Wait a minute: Can a candidate's clock read zero? Right. The rule is a maximum, not a minimum. Generals and other famous personages can go straight to the top. But if a politician first runs for some other major office, the 14-year clock starts ticking.

"Major office" means governorship, Congress, or the mayoralty of a big city: elective posts that, unlike offices such as lieutenant governor or state attorney general, can position their holder as a national contender. Bill Clinton became Arkansas attorney general in 1976, but his clock began ticking when he won the governorship two years later. Had he not won the presidency in 1992, his national career would have been over.

Among today's leading Democratic contenders, Lieberman, who in 2004 will be 16 years past his first election to the Senate, is just over the line. Several of the others are way over. Next year, Kerry will be 20 years from winning his Senate seat; Gephardt, 28 years from winning his House seat. Kucinich has been in the House only since 1996, but next year will be the 27th since his national debut as mayor of Cleveland. Graham was a superb candidate on paper, but he has been on the national stage for 25 years, first as governor and then as senator. Yawn.

In contrast, Edwards's clock will have only six years on it in 2004, and Clark's zero. Both candidates could lose next year and have time left for a comeback. Not so for Dean. He was first elected Vermont governor in 1992; if he fails to win national office next year, it's Good night, Howard.

Dean, by the way, succeeded to the governorship in 1991. Note that it is the first election, not the first year in office, that starts the clock, because election demonstrates political viability. Gerald Ford succeeded to the presidency in 1974 without having been elected either president or vice president. When he finally faced the nation's voters in 1976, it was 28 years since his first election to the House in 1948—a full 14 years beyond his expiration date. He lost.

I know what you're thinking: The 14-year rule is a fluke. You could always go through a century's worth of presidents and draw some sort of line retrospectively, but that would tell you nothing about the future. Besides, why the tricky-looking allowance for election to the vice presidency?

Actually, finding any political rule that works so well for a whole century is quite hard. And if you worry about the stipulation that 14 years must get a politician to the presidency or the vice presidency, look instead at the presidency on its own. In all but three cases (Johnson, Nixon, and the first Bush), the presidents elected since the first Roosevelt made it all the way to the Oval Office in 14 years or less. The clear implication is that Americans like fresh presidents: people with some experience, but not too much.

For some reason, the clock seems to stop during, but not after, vice presidential service. Minus his eight years as Eisenhower's VP, Nixon clocked 14 years to his 1968 presidential run, and he won; minus his four years with Carter, Walter Mondale clocked 16 years to his 1984 presidential run, and he lost.

My guess is that the stature conferred by vice presidential incumbency tends to offset staleness. Incumbent vice presidents get a head start when they run for president. Former vice presidents, however, need to re-establish their viability. Once they leave office, their clock resumes ticking. Had Nixon not won in 1968, we would not have had him to kick around any more.

By way of indirect confirmation, consider that unsuccessful major-party nominees also tend to be fresh faces, though not as reliably as successful nominees. Of 18 failed major-party nominees since 1904 (excluding incumbent presidents), only six were past their 14-year sell-by date. Fresh candidates are more likely to be nominated, and fresh nominees are more likely to win.

Is it artificial to begin counting with Theodore Roosevelt? I don't think so. Roosevelt was the first modern president, in the sense of winning a national following in his own right rather than being a vehicle chosen by his party. Before him, presidents tended to be either party loyalists with long elective experience, or generals with little or none. Party hacks liked time-servers and white knights. Voters, when they took charge, preferred something in between.

One other objection remains. What if the reason stale candidates don't win is that stale candidates don't run? If the current campaign's expired aspirants are breaking precedent by running, then the past might have little relevance.

No dice. I couldn't check for the whole century (perhaps some ambitious reader can do the spadework), but from 1984 through 2000, nearly half of Democratic and Republican presidential candidates were stale.

For instance, in 2000 I counted 11 Republican presidential aspirants, including several who dropped out early or bolted the party. Five of them had passed their sell-by dates. So had both of the Democratic contenders, namely former New Jersey Sen. Bill Bradley and Vice President Gore.

In the 1996 race to challenge Bill Clinton, six of the Republican contenders were stale—and the other three had never been elected to anything. The choice was between too much experience and too little. Bad move, Republicans. In 1992, four of seven serious Democratic contenders were stale. Luckily for the Democrats, the nod went to Clinton, who was in his 14th year.

In 1988 and 1984, the Democratic crops were fresher, but the point holds. Lots of stale people run for the presidency. They just don't win.

Reader, I crunched a lot of numbers for this article. Probably a few are wrong. If you find some, please write. The Law of 14, having been only recently discovered by an unnamed political genius and even more recently appropriated by me, is in its earliest, least-tested stage. However, the bottom line won't change: Presidential hopefuls have only about 14 years to make it to the White House.

In fact, I can think of only one case besides Johnson's that challenges the rule: that of George W. Bush. True, his clock had only six years on it when he ran for president in 2000. But he did not win the popular vote. The people's choice, albeit by the narrowest of margins, was Gore, who was past his expiration (though only by two years, having taken 16 to reach the vice presidency from Congress). The 14-year rule held, but thanks to the vagaries of the Electoral College and the Supreme Court.

Democrats, do not take comfort. Next year, Bush will still be only 10 years from his first election as governor of Texas. He'll still be fresh.
