The Honeymooners

One year in, Obama’s approval ratings have slipped, and they’re likely to get worse. He’ll probably muddle through seven more years of partisan acrimony, small-bore achievements, and bitter disappointment. But this is okay. In fact, it’s the definition of success for a modern president.

The promise and selling point of Barack Obama’s 2008 campaign—breaking with the past, delivering something new—was the oldest promise in American politics. Since European settlers crossed the Atlantic imagining (mistakenly) a “new world” without history, Americans have rewarded talk of new beginnings. The early colonists sought to create a society de novo in ways that Europe—with its religious wars, social stratification, and finitude of land—made impossible. To the Revolutionary generation, the acts of declaring independence and drafting a constitution seemed to ratify this mythology. And in every era since, Americans have fallen, starry-eyed, for leaders who speak of a future unencumbered by history’s weight. Theodore Roosevelt’s New Nationalism, Woodrow Wilson’s New Freedom, FDR’s New Deal, JFK’s New Frontier, even George H. W. Bush’s New World Order—all began with the promise of the new.

Of course, after the flush of a campaign, both voters and presidents have invariably discovered that history imposes constraints. After the Civil War, a cohort of young intellectuals invested hope in Ulysses S. Grant, only to see rampant corruption persist and the dream of reconstructing the South dissolve. After World War I, the crash-and-burn of Wilson’s noble quest for “peace without victory” soured Americans on an energetic executive for a decade. Bill Clinton’s New Covenant, a dead-on-arrival slogan, presaged the letdown that came as his followers realized that liberalism’s revival would require more than a few token compromises.

Obama in 2008 was just the latest aspirant to talk of beginning anew. He bested Hillary Clinton for the Democratic nomination in part by saddling her with the record of not one but two past presidents: the residual regret over her husband’s supposedly small-bore and blandly centrist Third Way agenda, and the collective buyers’ remorse over the Iraq War. In contrast to the dreaded “incrementalism” of the Clintons, Obama’s candidacy tantalized voters with a chance for what he called “transformational” or “fundamental” change.

One year later, transformation looks like a fleeting dream. No one knows whether Obama can deliver massive change on the scale of Lincoln, Wilson, FDR, or LBJ. But right now, the opportunity that loomed last fall seems to have passed. Conservatives—uncharacteristically mute last winter—have regained their voice, nearly derailing Obama’s health-care plan and keeping the administration on defense in the daily media wars. Meanwhile, liberals and leftists, who largely muffled their doubts when Obama had a presidency to win, are suddenly seething over his moderation and compromises—keeping suspected terrorists jailed indefinitely, countenancing his treasury secretary’s coziness with financial CEOs, letting center-right senators weaken his health-care plan. Washington pundits, for their part, intoned throughout 2009 that in taking on health care, energy, and financial reform in his first year, the president was attempting “too much.”

Yet the now-prevalent pessimism about Obama’s presidency is surely unwarranted. True, we can no longer expect Obama to be the agent of a post-partisan politics, or an uncorrupted anti-politician incapable of spin or triangulation, or America’s most civil-libertarian president, or a socialist. But in the modern age, presidents are never able to meet such expectations. Our hunger for presidential intervention, leadership, and salvation now exceeds any individual’s capacities. So the eclipse of these campaign-trail fantasies about Obama’s presidency hardly signals its death. On the contrary, it marks the true beginning.

“If there is anything that history has taught us,” John F. Kennedy said on the campaign trail in 1960, “it is that the great accomplishments of Woodrow Wilson and of Franklin Roosevelt were made in the early days, months, and years of their administrations. That was the time for maximum action.” But Kennedy was wrong—unless you choose to focus exclusively on the word years instead of days and months. As rich in opportunity as presidential honeymoons can be—and the best executives have used them to get important things done—a president’s real work doesn’t occur when he has what Obama calls the righteous wind at his back. It occurs when he has to soldier on into a fight, despite blustery headwinds.

Like the unit of 100 days, the benchmark of a president’s first year matters a lot to journalists but relatively little to historians. The 100-days concept itself, which originated with Roosevelt’s flurry of activity in early 1933, soon devolved into a transparent public-relations gimmick, as media-age presidents sweated over how to boost their grades on what soon came to be recognized as the president’s initial report card. Similarly, the now-ritualized year-one evaluation, though harmless as an exercise in journalistic stock-taking, offers a weak basis for predicting future performance. Indeed, none of the three presidents Obama has taken as his role models—Lincoln, FDR, and Kennedy—enjoyed a first year that foretold the direction of his presidency. Transformation doesn’t happen overnight.

Abraham Lincoln is Obama’s favorite president and his aspirational model. In 2007, the senator from Illinois launched his bid for the Oval Office in Lincoln’s shadow, on the steps of the Springfield Old State Capitol. With his message of national conciliation, Obama often echoed Lincoln’s second inaugural address. Even when he attacked his rivals, he suggested that he was merely combating their retrogressive politics, while he was summoning the better angels of our nature. At times, the Lincoln comparisons taxed credulity: Obama’s devotees even pointed to Lincoln’s one-term service in Congress—and his subsequent rise to become America’s greatest president—to answer the charge that Obama hadn’t accomplished enough in his career to earn him the White House. It was no surprise when, in January 2009, the incoming president took his inaugural oath on the Bible Lincoln had used, and presided over festivities branded as “A New Birth of Freedom.”

Yet as Obama surely knows, Lincoln—a transformative president if there ever was one—started his administration on a shaky note. His inaugural address fumblingly extended an olive branch to the seceding states of the South, promising (to no avail) that he would enforce the fugitive-slave law and uphold slavery in the states where it was legal. The Confederate attack on Fort Sumter forced Lincoln to change course. But on the crucial matter of slavery, the president—who had never considered himself an abolitionist—remained fairly conservative. “If I could save the Union without freeing any slave I would do it, and if I could save it by freeing all the slaves I would do it,” he wrote to Horace Greeley in 1862, “and if I could save it by freeing some and leaving others alone I would also do that.” Few foresaw that his presidency would end with the abolition of slavery and a redefinition of freedom, union, and equality.

Lincoln also needed time to gain his footing as commander in chief. Unsure of himself in military affairs, he was at the mercy of his generals, including the aging and detached Winfield Scott. Dispiriting defeats—notably at the First Battle of Bull Run, in July 1861—emboldened the South. Even after Lincoln mustered the wisdom to replace Scott, George B. McClellan, his new top commander, frustrated the president by declining to advance against Confederate forces. As for his domestic agenda, Lincoln, like most 19th-century presidents, followed Congress’s lead. But even there, despite a Republican leadership eager to exploit the sudden absence of Southerners, major laws—the Homestead Act, the Pacific Railway Act, and the Morrill Land Grant Act—didn’t get the president’s signature until 1862.

No one could say that Franklin Roosevelt began his first year in office hesitantly. His first 100 days were indeed a whirlwind of legislative and executive feats. But FDR geared his first-year efforts almost entirely toward recovery—a necessary but hardly transformative goal.

Certain measures—like solving the banking crisis, which had reached catastrophic proportions on the eve of his inauguration—made a palpable difference. But the core elements of FDR’s “First New Deal” turned out to be, on the whole, ineffectual or unconstitutional—or both. The National Recovery Administration, the centerpiece of it all, which relied on industry leaders to agree to production codes, was flawed in both conception and execution, and it failed miserably. When the Supreme Court unanimously ruled it unconstitutional, Roosevelt’s aide Robert Jackson called the decision a blessing in disguise, since it spared the president from having to watch Congress decline to renew the act. The Agricultural Adjustment Act, which regulated farm production through central planning, was also struck down. And then there was Roosevelt’s Economy Act, a misguided effort in budget balancing taken up before Washington discovered the wisdom of deficit spending.

Most of the New Deal’s lasting elements didn’t come until 1935. Only after taking a beating on the airwaves from demagogic populists like Senator Huey Long of Louisiana and the radio priest Charles Coughlin did FDR sign on to the Social Security Act, which created unemployment insurance, old-age pensions, and a safety net for the disabled. And not until his second term did his administration embrace a Keynesian strategy of aggressive spending to lift the economy out of crisis. If Roosevelt’s first year was historic for its activist spirit and purposeful intervention, its economic philosophy left little mark.

While Obama styled himself Lincolnian in his rhetoric of reconciliation, and Rooseveltian in his steadfastness in the face of economic distress, he just as often summoned the Kennedy mystique, presenting himself as the telegenic, inspirational torchbearer of an ascendant generation. Obama suggested that he wanted to “move the country in a fundamentally different direction,” as he believed Kennedy had. Just as Kennedy’s election shattered the anti-Catholic taboo in presidential politics, Obama’s promised to topple an age-old wall of racial prejudice. The Baby Boomers who flocked to Obama’s candidacy said he brought back memories of JFK. The claim was echoed most tellingly by the fallen president’s own brother, who anointed Obama as JFK’s successor after perceiving a slight to the family name in Hillary Clinton’s assertion that the skill of Lyndon Johnson—she didn’t mention Jack—had been instrumental in passing the 1964 Civil Rights Act.

In fact, on civil rights, as in other areas, Kennedy’s first-year performance dismayed his enthusiasts. As a candidate, he had vowed to desegregate federal housing with “a stroke of the presidential pen.” But once in office, he demurred; fearful of alienating powerful southern Democrats whose support he needed on other issues, he focused instead on foreign-policy problems. Not until he’d cleared the 1962 midterm elections did Kennedy issue the housing order. Caution likewise informed his response to the Freedom Riders—the activists who rode buses across the South starting in May 1961 to force the government to uphold the Supreme Court’s desegregation of interstate travel. When white southerners brutally beat the activists, Kennedy and his aides, unprepared, at first tried to stop the rides, sending in federal marshals only when it seemed that the violence might turn deadly.

In foreign policy, too, the biggest developments of JFK’s debut year yielded little positive transformation. The Bay of Pigs invasion, an ill-conceived CIA scheme hatched under Dwight Eisenhower, redounded to Kennedy’s benefit only because he had the sense not to duck responsibility. At his June summit in Vienna with Nikita Khrushchev, the new president felt he was verbally pummeled by the Soviet premier, in what Kennedy called the “roughest thing in my life.” Kennedy’s tepid response may have encouraged Khrushchev to erect the Berlin Wall that fall. When that happened, too, JFK was slow to act (“Kennedy: You can’t stop tanks with words,” read one West Berliner’s protest sign), and even his decision to send retired General Lucius Clay and Vice President Johnson to West Berlin to boost morale did nothing to deter the Soviets. At the end of 1961, Kennedy’s aide Ted Sorensen mentioned that two reporters were considering writing books about the year gone by. Kennedy was mystified: “Who would want to read a book on disasters?”

The presidency that Obama’s resembles most so far isn’t any of these but, ironically, that of Bill Clinton—ironic because Obama, speaking in January 2008 about what makes a good president, implicitly denigrated Clinton even as he praised Ronald Reagan for having “changed the trajectory of America” and “put us on a fundamentally different path.” Obama, many speculated at the time, may have been playing head games with his peevish predecessor, goading him into another outburst that would thrill the press pack. Even so, it was a strange reading of history. Reagan’s election, after all, did not initiate but culminated a long conservative effort to gain control of the levers of power; his decisions as president moved his party to the right, but they also introduced fissures and frustrations into the conservative alliance. Clinton’s tenure, in contrast, began a new era for the Democrats, and after his eight years, virtually all of the party’s leading lights embraced what had been controversial stands in 1992: an internationalist foreign policy, a growth-centered economics, and a willingness to link social policies to family values.

The point would be trivial had Obama not reached for Clinton’s 1992 playbook during the fall 2008 campaign. Obama’s battle with John McCain, which centered on the hard-pressed middle class, showed that Obama represented less a repudiation of Clinton (as the primaries had suggested) than a continuation. His rhetoric wafted to earth to focus on everyday economic concerns. His convention speech opened, after the preliminaries, not with soaring visions of post-partisan unity but with issue-based, it’s-the-economy-stupid plain language:

Tonight, more Americans are out of work and more are working harder for less. More of you have lost your homes and even more are watching your home values plummet. More of you have cars you can’t afford to drive, credit-card bills you can’t afford to pay, and tuition that’s beyond your reach.

Obama discovered this idiom just in time for the financial chaos and the debates with McCain.

Obama’s successes and struggles in his first year bear striking resemblances to Clinton’s. Both men were elected with similar mandates—Clinton won 370 electoral votes, Obama 365—and majorities in both houses of Congress. Both opened their first years well by signing a few queued-up executive orders and bills—including the Family and Medical Leave Act, for Clinton, and the Lilly Ledbetter Fair Pay Act and the expansion of the Children’s Health Insurance Program, for Obama. And both made economic revival their first priority. Both men also entered office facing tooth-and-nail resistance from a right wing that had just lost the presidency. The right imagined Clinton, as it does Obama, to be far more radical than he really was, and it thus tried to delegitimize him. A short line connects the “Who shot Vince Foster?” conspiracy theories to those surrounding Obama’s citizenship.

Republicans also forced Clinton to pass his first economic plan without their support, much as they tried to scuttle Obama’s stimulus package. And despite losing the legislative battle, they succeeded in shaping public perception of these economic bills after their passage. Clinton’s 1993 budget—which not only set the government on course for a record surplus, but also cut taxes for millions while raising them on very few—was nonetheless portrayed, and viewed by most Americans, as a tax hike. In parallel fashion, economic evidence suggests that Obama’s spring stimulus bill has already done some appreciable good. But according to an August Gallup poll, Americans consider it too big and are uncertain about its benefits. And while Obama seems likely, as of this writing, to emerge from his first health-care fight with more to show for it than Clinton did from his, the final bill probably won’t be more than an incremental step or two forward—less like Medicare than like the 1996 Kennedy-Kassebaum Act, a now-forgotten consolation prize that Clinton garnered later in his presidency.

The reassertion of political limits and the deflation of campaign-season euphoria make it unlikely that Obama’s presidency will be “transformational” in the sense that he spoke of on the campaign trail—Lincolnian in its boldness, Rooseveltian in its activism, or Kennedyesque in its uplift. More likely, it will resemble Clinton’s presidency, with eight years of muddling through, frequent bouts of sharp partisan opposition, fluctuating poll ratings, and dashed hopes.

This should be no cause for distress. Obama could do worse than to emulate Clinton, who, at the end of the day, left the country better off than when he took office. Clinton’s record remains undervalued, partly because a misleading narrative took hold (that his impeachment cost him the chance to do more), and partly because many of his gains were achieved not through the big-ticket stand-alone legislation that journalists recite in their year-end summaries but through less visible allocations within the interstices of the federal budget. No single law or presidential order gave us the longest economic expansion in history, the lowest unemployment rates in three decades, or the declines in poverty, crime, and teen pregnancy. Nor does Clinton deserve sole credit for these feats. But all were accomplished during his eight years.

Twenty-five years ago, the political scientist Theodore Lowi published a book called The Personal President. It argued that the increasingly large responsibilities placed on the president since Franklin Roosevelt’s time—of regulation, social provision, and economic management, to say nothing of the leadership of the free world—have exploded into impossible expectations. Every postwar chief executive, Lowi noted—and the observation still holds—has begun his presidency with high approval ratings and left office with the public chastened of its early optimism, if not disillusioned altogether. (The president who has exited the White House with the highest approval ratings, post-FDR, is Clinton.)

It is easy to propose that we lower our expectations for our new presidents—even, or perhaps especially, for presidents who come bearing lofty promises of transformation. But we can’t correct the problem, Lowi’s diagnosis suggested, simply by resolving to demand less from our chief executives or by vowing to learn from the past. The problem is rooted in nothing less than the presidency’s assumption of immense powers, and of a central role in our imagination. Candidates have no better path to victory than by inspiring us with dreams of a new political era, and presidents have no choice but to attempt “too much.” In doing so, however, they can only disappoint us.
