In 1944, Harry Truman was asked by his friend and Senate colleague Owen Brewster what Franklin Roosevelt was really like. Truman hadn’t gotten to know his running mate very well, but the Democratic vice-presidential nominee had spent enough time around FDR to provide a succinct answer.

“He lies,” Truman replied.

At that point, the most consequential issue the president was untruthful about was his health. Roosevelt was failing rapidly, as his physicians knew, and as those around the White House could not help but notice. With the Allies opening a second front in Europe and island-hopping across the Pacific, the commander in chief was working at most four hours a day, and sometimes as little as one or two. On March 28 that year, Dr. Howard G. Bruenn, Roosevelt’s cardiologist, had given his diagnosis: hypertension, hypertensive heart disease, cardiac failure, and acute bronchitis. The president’s condition, Bruenn later explained to Jan Kenneth Herman, editor of Navy Medicine, was “god-awful.”

The American public, and the world, received a far different image: that of a jaunty, robust president preparing to crush Hitler and the Japanese empire while cruising to an unprecedented fourth term. Truman quickly became complicit in this deception. The Missouri senator—considered then and now a straight shooter—went to lunch at the White House on August 18 and told reporters afterward that Roosevelt “looked fine and ate a bigger lunch than I did.”

But Truman provided a different account in the privacy of his Senate office. “I had no idea he was in such feeble condition,” he confided to his military aide Harry Vaughan, noting that when the president poured cream into his coffee, more went into the saucer than into the cup. Winston Churchill had seen with his own eyes evidence of FDR’s physical decline the year before, in 1943, but raised no objection to the American administration’s deception—Roosevelt was too important to the war effort. That same year, at a conference in Tehran in which the Allies discussed opening new fronts against Nazi Germany, Churchill stressed the need to keep the Allies’ plans secret. To Joseph Stalin, he said, “In wartime, truth is so precious that she should always be attended by a bodyguard of lies.”

“Churchill’s line is par for the course in wartime, when you have to keep your secrets to yourself,” Sean Wilentz, a professor of history at Princeton, told me this summer when we discussed the morality, and the utility, of shading the truth in the White House. “Presidents lie for all kinds of reasons,” he added. “Richard Nixon lied because he was trying to save his presidency, which was imperiled by his misdeeds. Franklin Delano Roosevelt misled the country over things like Lend-Lease in order to advance a policy he thought would save the world, but which he knew would be difficult to sell politically. Honesty doesn’t necessarily make for an effective presidency … What the public has to judge is whether [presidents] are lying for the good of the country—or for their own good.”

Wilentz’s interest in this question is not entirely academic. In December 1998, he spoke passionately before the House Judiciary Committee against the articles of impeachment leveled against President Clinton, the most serious of which was lying under oath. Wilentz argued that Clinton’s transgressions, stemming as they did from personal—not official—conduct, simply were not what the Founders envisioned when they gave Congress the remedy of impeachment. “As a historian,” he told Congress, “it is clear to me the impeachment of President Clinton would do greater damage … to those institutions and to the rule of law, much greater damage, than the crimes of which President Clinton has been accused.” Clinton initially staked his political survival on proving his veracity. When that was found wanting, his advocates successfully argued that perjury was too grave a charge to apply to purely private behavior.

In his second term, George W. Bush faces a credibility crisis of his own. And while the current president may not share his predecessor’s earthy predilections, he also doesn’t have the same excuse, a wish to protect his private life. Bush is accused of equivocating not about his personal life but about one of the most fundamental public-policy decisions a democracy can confront: the decision to take the nation to war.

For the past year and a half, a majority of Americans have expressed doubts that the president’s reasons for ordering the military to invade Iraq were those he articulated publicly. An April 2005 Gallup Poll found that a majority of Americans believed Bush “deliberately misled the American public” about whether Iraq possessed weapons of mass destruction. In May 2006, 46 percent of respondents in an ABC News/Washington Post survey said they believed that the Bush administration had said “what it believed to be true” in making its case for Iraq, while 52 percent said that it had “intentionally misled the American public.”

Bush loyalists have blamed the media for such perceptions, but even allowing for a heavy dose of anti-Bush feeling on the part of the networks, news outlets, and publishing houses, much of the American public has altered its opinion regarding the veracity of the man in the Oval Office. Bush’s place in history, however, will depend not on whether he lied to the American people—every president, arguably, has succumbed to that temptation—but how he lied, what consequences his lying unleashed, and how he ultimately responded to them. Put bluntly, posterity will judge the current president not so much by whether he told the truth but by whether he recognized what the truth actually was.

Why do presidents lie? Do they lie more than most people? Are lies of omission essentially the same as lies of commission? What about presidents who convince themselves of things that are untrue—who are, we would say, “in denial”? Is this tantamount to lying? Can presidents be truly effective without lying—or are there times when they simply must engage in deception? If so, when? And how is the public to know whether presidents are abusing that prerogative?

The first question might be the easiest to answer: presidents lie because they are human.

“Everybody lies,” says Charles Ford, a professor at the University of Alabama at Birmingham and the author of a book about the psychology of deceit. “It is part of human nature, ubiquitous in the animal kingdom. However, some people lie compulsively, often when the truth would serve them better.”

Admonitions against lying are as old as Western civilization itself, but the Ninth Commandment was applied to the presidency by the first presidential biographer—a parson named Mason Locke Weems, who not only launched the cult of the president-as-truth-teller but did so retroactively with that famous, but unverifiable, cherry-tree story. Ever since, historical revisionism notwithstanding, American schoolchildren have been raised on the standard of a U.S. president who didn’t lie—couldn’t lie—even as a six-year-old boy. Abraham Lincoln was said to have walked miles as an Illinois store clerk to return a few cents’ change. His “Honest Abe” nickname, which predated his presidency, was an advantage that his opponent Stephen Douglas tried to erase by calling him “two-faced.” (Lincoln’s response: “I leave it to [my audience]. If I had another face, do you think I’d wear this one?”) Mark Twain deadpanned that Americans held their presidents to a standard few mortals could meet. “I am different from [George] Washington,” he would say. “I have a higher and grander standard of principle. Washington could not lie. I can lie, but I won’t.”

Presidents prevaricate for the reasons other people do: pathology, politeness, paternalism, convenience, shame, self-promotion, insecurity, ego, narcissism, and even, on occasion, to further a noble goal. Presidents also have burdens not felt by most of us—keeping the nation safe, for one. High-level statecraft requires a talent for telling divergent groups of people what they want to hear. This is not the best recipe for truth telling, particularly in times of war or national peril.

All lies, unlike all men, are not created equal. Philosophers from Aristotle to Niebuhr have made moral distinctions among falsehoods, whether “white lies” told for social convenience or to spare feelings, “excuses” that are only half true but that rationalize our own behavior, lies told during a crisis, lies told to liars, paternalistic lies told to protect those we care about, and lies told for the social good—also known as “noble lies.” The presidential scholar Richard Norton Smith points out that Thomas Jefferson’s own interpretation of the Constitution’s limits on presidential power probably didn’t allow for the Louisiana Purchase. Yet in office, having sworn to uphold that Constitution, Jefferson couldn’t resist stretching the words and doubling America’s size for a few million dollars. “And talk about a turnaround—it was Nixon who went to China,” Smith added. “It’s often the flip-flop, in pursuit of interests that transcend ideological consistency, that puts a president on Mount Rushmore.”

Nor does the innocent social lie told by presidents—or on a president’s behalf—usually get a president in trouble. Asked for President Reagan’s reaction after winning a hard-fought 1981 vote in Congress authorizing the sale of AWACS planes to Saudi Arabia, the White House aide Michael Deaver told reporters the president exclaimed, “Thank God!” What Reagan actually said, according to someone in the room, was, “I feel like I’ve just crapped a pineapple.”

The first cousin of the white lie is the idle boast—not quite so harmless, but not nefarious, either. The unkindest way of looking at the bio-lie is to say that presidents (and presidential candidates) tend to be braggarts. More charitably, one could conclude that the job description seems to demand some fiddling with one’s pedigree or accomplishments as a way to bond with voters—to tell them, in effect, “I’m one of you.” In 1840, William Henry Harrison campaigned as a rustic born in a log cabin and indulged crowds with Indian war whoops at his rallies. In reality, the Whig nominee was the scion of an elite colonial family (his father was a signer of the Declaration of Independence and a three-term governor of Virginia); a professional soldier, Harrison read the classics and enjoyed fine living.

That kind of personal embellishment was also at work when John F. Kennedy, courting the elites as well as the masses, told Time magazine’s Hugh Sidey that he could read 1,200 words a minute (a figure JFK pulled out of the air); when Lyndon Johnson exclaimed to U.S. troops in Korea that his great-great-grandfather “died at the Alamo” (a great-great-uncle fought at San Jacinto, but wasn’t killed); when Bill Clinton claimed he’d heard about the Iowa caucuses “since I was a little boy” (they didn’t begin until he was in graduate school); and when Al Gore told a labor crowd that his mother used to lull him to sleep when he was a baby with “Look for the Union Label” (a ditty written in 1975, when Gore was twenty-seven years old). One bizarre whopper: Ronald Reagan told Israeli Prime Minister Yitzhak Shamir and the Nazi hunter Simon Wiesenthal, in separate Oval Office visits, that as a young soldier in the U.S. Army Signal Corps during World War II, he had filmed the liberation of Nazi death camps; Reagan never served in Europe at all, though his work involved handling footage shot by military cameramen and war correspondents. Covering the White House for the past dozen years, I’ve become something of a connoisseur of the presidential boast. My favorite was when Clinton told The Des Moines Register editorial board that he was the only president who knew anything about agriculture before coming to office—skipping over actual farmers like Washington, Jefferson, Truman, and Jimmy Carter, as well as the Iowa farm boy Herbert Hoover.

George W. Bush does this sort of thing, too. During a January 2002 visit to West Virginia, the president kibitzed with Bob Kiss, the Democratic speaker of that state’s legislature, over something they had in common: twins. “I’ve been to war,” Bush said. “I’ve raised twins. If I had a choice, I’d rather go to war.” It was a funny line, except that Bush did have a choice to serve in a war—in Vietnam—and didn’t.

As candidates seek to woo voters with promises of good things to come, truth is often the first thing jettisoned on the campaign trail. This phenomenon is not new. In the closing days of the 1932 campaign, Franklin Roosevelt promised a crowd in Pittsburgh that he’d balance the federal budget while cutting “government operations” by 25 percent. Wisely, he attempted neither, but four years later as he prepared for another campaign trip to western Pennsylvania, he asked his speechwriter Sam Rosenman what he should say if his earlier vow came up. “Deny you were ever in Pittsburgh,” Rosenman replied.

Rosenman’s quip is still funny. One could, of course, huffily demand absolute fealty to the truth, but one could also live in the real world and find comforting reassurance that neither a president nor his wordsmiths believe all their own public relations. We still should make distinctions. Some falsehoods—like many campaign lies—are relatively harmless; they may soil an opponent’s résumé or polish one’s own, but their consequences are slight. Deceptions to promote or protect a policy or presidential action—call them governing lies—are more consequential, and it is by their consequences that they should be judged, as the American public harshly judged the lies told about the Vietnam War and about Watergate. Over the course of a tumultuous decade, those consequences included not just the ignominious end of an unpopular war and the fall of a president but a profound change in how much deceit the public—and the media—would tolerate from the Oval Office.

Even before those twin traumas came to a head, the American public’s trust in its leaders had frayed. David Wise, a former White House correspondent turned investigative reporter, tapped into widespread public disgust in his catalog of presidential untruths, The Politics of Lying, published in 1973. “By 1972 the politics of lying had changed the politics of America,” he wrote. “In place of trust, there was widespread mistrust; in place of confidence, there was disbelief and doubt in the system and its leaders.” Still to come were Watergate and the release of White House tapes on which Lyndon Johnson blurts out to Robert McNamara that he knows that the reason used to justify the massive buildup of troops in Vietnam—the supposed attack on U.S. Navy ships by North Vietnamese patrol boats in the Gulf of Tonkin—was fiction. By 1975, the year Saigon fell, 69 percent of Americans answered affirmatively to a poll question asking whether “over the last ten years this country’s leaders have consistently lied to the people.”

That widespread erosion of trust also prompted Sissela Bok, the daughter of the famed Nobel laureates Gunnar and Alva Myrdal and a professor, at the time, of philosophy and lecturer at Harvard Medical School, to write Lying: Moral Choice in Public and Private Life—a touchstone, since its 1978 publication, for those seeking to examine the morality and the social costs of lying. Bok argued that while there are rare occasions when a lie may be justified, these falsehoods (which range from harmless social lies to extreme scenarios like telling a would-be murderer you don’t know where to find an intended victim) had contributed to a general disregard for truth telling. Bok’s book, which sprang out of her own research on the ethics of administering medical placebos, did not focus on lying politicians, though they pop up here and there. But her deep concern about the decline of trust, and her call for political, corporate, and educational institutions to take the lead in demanding and rewarding truthfulness, resonated with the public.

Presidential candidates, even those with reputations for evasiveness, reacted to the growing focus on honesty the way you might expect: they accused their opponents of lying, while promising not to lie themselves. Richard Nixon had been elected president in 1968 after positioning himself as the peace candidate. As the phrase credibility gap gained currency in the context of Johnson’s lies about the Vietnam War, the incoming White House communications director, Herbert Klein, vowed, “Truth will become the hallmark of the Nixon administration.” But by the time Nixon left the White House, a new catchphrase had entered the lexicon: What did the president know, and when did he know it? Upon being sworn in to succeed Nixon, Gerald Ford pronounced truth “the glue that holds government together.” Jimmy Carter went Ford one better, in his unequivocal promise to the American people: “I will never tell a lie. I will never make a misleading statement. I will never betray the confidence any of you has in me.”

By many measures, the Carter administration was more open, transparent, and truthful than many of its predecessors—and successors as well. But Carter’s monastic vow of absolute truthfulness also generated its own blowback, most memorably Steven Brill’s stinging piece “Jimmy Carter’s Pathetic Lies.” One example: “If you ever have any questions or advice for me,” Carter told audiences, “just put Jimmy Carter, Plains, Georgia, on the envelope … I open every letter myself, and read them all.” This was an impossibility: the mail was forwarded, as it had to be, to Carter’s campaign headquarters in Atlanta.

Such campaign exaggerations aside, Carter’s commitment to truth telling did not wear well. “Humankind cannot bear very much reality,” wrote T. S. Eliot, and after four years of Carter, the American electorate was no exception. Carter’s revelatory form of communication (admitting to Playboy that he “looked on a lot of women with lust,” for example, or bemoaning a national “crisis of confidence”) was a poor substitute in voters’ minds for executive-branch competence, or for leadership that could make Americans feel good about themselves. As Western Illinois University history professor George Hopkins sees it, all presidents lie for the simple reason that if they didn’t, we wouldn’t elect them. “So the problem is not them, it’s us,” Hopkins told me recently. “We should look in the mirror.”

Thus, Carter was involuntarily retired by Reagan, who berated Carter for distorting his record but had a tendency himself to stretch the truth if it made for a good yarn. Witness a favorite Reagan story about his role in a football game in high school in which, he claimed, players for a rival school, Mendota, complained to the referees that Reagan, playing for Dixon High, had committed a penalty that was not called. The refs supposedly asked him about it. “I told the truth,” Reagan later said. “The penalty was ruled, and Dixon lost the game.” My father, the Reagan biographer Lou Cannon, investigated this claim. He discovered that there were no contemporaneous accounts of any such incident, and that Dixon lost to Mendota only once when Reagan was a member of the varsity team—by a score of 24-0. “The ironic point here is that Reagan seems to have told the story to demonstrate how truthful he was,” notes George Mason University political scientist James Pfiffner, who has studied presidential lying. “Yet he was telling an untruth to make the point.”

More infamously, in November 1986, Reagan told the American people that his administration had not traded weapons “or anything else” to Iran in return for American hostages captured in Lebanon. Three weeks later, in a radio address, the president softened this to “Let me just say it was not my intent to do business with [Ayatollah Ruhollah] Khomeini, to trade weapons for hostages.” Three months after that, in an Oval Office address, Reagan confessed: “A few months ago I told the American people I did not trade arms for hostages. My heart and my best intentions still tell me that’s true, but the facts and evidence tell me it is not.” Reagan’s presidency was winding down, but the not-yet-begun presidency of George H. W. Bush was already marred by his insistence that he was “out of the loop” on the Iran-Contra arms-for-hostages scandal. Special prosecutor Lawrence Walsh spent all four years of the first Bush presidency examining that alibi. Concluding that it was bogus, Walsh released documents three days before the 1992 election showing that Bush had attended crucial Iran-Contra meetings and approved the plan.

Bill Clinton’s mendacity as president—or, depending on your politics, independent counsel Kenneth Starr’s perjury trap—was the backdrop for the campaign in 2000 to find a new president. Once again, the public was looking for a relatively honest politician. As Sissela Bok wrote in the preface to an updated edition of Lying, released in 1999, “No matter how our own period comes to be judged … what is already certain is that we are all on the receiving end of a great many more lies than in the past.” John McCain dubbed his campaign bus the “Straight Talk Express” and ended rallies by proclaiming that, as president, he would “tell the American people the truth—even if it’s bad news.” In the general election, George W. Bush stressed this theme, too. In his third debate with Al Gore, Bush said the country needed “somebody in office who will tell the truth.” This was not a casual observation; it was a premeditated talking point for the Bush-Cheney campaign that night—and for the rest of the month of October.

By then, the story line of that campaign was framed as Al Gore’s Bill Clinton–style exaggerations versus George W. Bush’s Dan Quayle–style bloopers. Afterward, Bush assumed office with a reputation as a truth teller, even among Americans who didn’t support his policies—or didn’t think he was all that bright. Two months before 9/11, in an Opinion Dynamics poll, 69 percent of Americans—21 percent more than had voted for him—responded that they found Bush, who had campaigned on conservative themes, to be “honest and trustworthy.” Only 20 percent of respondents disagreed with that sentiment.

How did Bush go in the public eye from truth teller to prevaricator in chief? He has now been pilloried, after all, not just by outspoken liberals—in books like David Corn’s The Lies of George W. Bush and Al Franken’s Lies and the Lying Liars Who Tell Them, in documentaries like Eugene Jarecki’s Why We Fight and Michael Moore’s Fahrenheit 9/11, and on countless Web sites—but also by an increasingly hostile mainstream media, including top national newspapers, The New Yorker, and this magazine, as well as high-profile books on the Iraq War like Thomas E. Ricks’s Fiasco, Peter W. Galbraith’s The End of Iraq, and State of Denial, the third volume in Bob Woodward’s trilogy about this administration.

The critique that Bush isn’t honest with the American people confounds many of those who have worked most closely with him. That includes two aides whose opinions and integrity I greatly respect: Michael J. Gerson, Bush’s former chief speechwriter, who describes the president as a “compulsive truth teller,” a man so guileless that he can’t hide his boredom when making speeches he doesn’t want to give; and Peter Wehner, director of the White House Office of Strategic Initiatives, who argues that the notion Bush “lied” about the presence of weapons of mass destruction in Iraq is absurd on its face—and that those accusing Bush of dishonesty are the ones deeply mistaken.

Yet Bush has gradually built a record of partial truths, half-truths, and untruths. Several cases in point:

■ In his 2006 State of the Union address, Bush cited Iraq and Afghanistan as examples of “the great story of our time”—the advance of freedom. He proclaimed that the number of democracies in the world had increased from about two dozen at the end of 1945 to 122 today, but he didn’t mention that neither Iraq nor Afghanistan was counted as such by the organization whose statistics he was touting.

■ The president also asserted, accurately, that the U.S. economy had gained 4.6 million new jobs in the previous two and a half years, but he failed to note that it had lost 2.6 million jobs in his first two and a half years.

■ In March 2003, Bush insisted that it was “a matter of fact” that the coalition he cobbled together for the invasion of Iraq included more nations than the alliance assembled by his father in 1991. In Fiasco, Ricks deconstructs this argument by pointing out that most of George W. Bush’s partnering nations (with the notable exception of the British) were a reluctant bunch. The Poles fought, but resented being there. The Italians wouldn’t get out of their vehicles on patrols. The Japanese wouldn’t patrol at all and, in fact, wouldn’t even guard their own perimeters—Dutch troops did it for them.

■ In June 2004, when asked about Ahmad Chalabi, the Iraqi exile who’d done so much to encourage the U.S. invasion of Iraq, but who had fallen out of favor with U.S. military leaders, Bush acted as if he barely knew the man’s name. “Chalabi? My meetings with him were very brief,” Bush said. “I think I met with him at the State of the Union and just kind of working through the rope line, and he might have come with a group of leaders. But I haven’t had any extensive conversations with him.” Perhaps. But Chalabi wasn’t confined behind a rope line at the 2004 State of the Union address. He was listed by the White House as a “special guest” of first lady Laura Bush and seated directly behind her.

■ After the Democrats’ victories in the 2006 midterm elections, the president allowed that “Democrats are going to support our troops just like Republicans will” and that the Democratic congressional leaders Nancy Pelosi and Harry Reid “care about the security of this country, like I do.” Those gracious statements were at odds with his campaign rhetoric from just days before, when he had said, regarding Iraq, that if the Democrats’ vision were to prevail, “the terrorists win and America loses.” On November 8 at the White House, Bush suggested that it was the campaign talk that was disingenuous. But maybe it was the other way around—that Bush meant what he said in Texas, and was only being politic in the East Room.

■ At the same press conference, Bush essentially admitted he’d lied to three White House correspondents who had asked him in an Oval Office interview the week before whether Defense Secretary Donald Rumsfeld was staying on. The president had assured the three reporters that Rumsfeld was remaining. Now, standing in the East Room, Bush was revealing the details of a different reality: he’d decided before the election to sack his Pentagon chief, and when asked the question, he was already focused on Rumsfeld’s likely replacement. Bush provided dueling explanations: First, he maintained that he didn’t “want to inject a major decision about this war” into the waning days of a campaign. Then he immediately added the more Clintonesque explanation that his answer hadn’t really been dishonest, because he hadn’t yet had his “final” conversation with Rumsfeld, and hadn’t interviewed Robert Gates in person.

Presidents have rarely told the full truth in the midst of major military operations, and until Vietnam, Americans tended to cut them slack for the sake of the troops, if nothing else. During World War II, for example, the government launched an elaborate disinformation campaign to mask the details of D-Day—an episode that Sissela Bok cites as a prime example of a lie commonly thought justifiable, and one that was the precise context of Churchill’s line to Stalin about protecting truth with a bodyguard of lies. Similar fabrications would fall under Bok’s definition of “paternalistic lies.” The everyday version would be a parent falsely reassuring a child that Mommy and Daddy are not fighting. Another presidential equivalent would be falsely reassuring the citizenry on issues of national security for their own protection. On December 9, 1941, two days after Pearl Harbor, President Roosevelt told Americans in a radio address that they were now in the war “all the way,” while promising to share “together the bad news and good news.” But FDR couldn’t quite bring himself to reveal the extent of the losses in Hawaii, saying that he lacked “sufficient information,” which was not exactly true.

Serious commentators on American public life have seldom questioned a president’s right to lie in such circumstances. But they are starting to. Consider the media hand-wringing triggered by President Bush’s surprise trip to Iraq in November 2003, when he and his aides lied about his Thanksgiving plans in an effort to preserve his safety and that of the troops he was to visit. As with the Vietnam War, the course of events in Iraq is prompting a reevaluation of truth telling for its own sake.

In When Presidents Lie, the liberal political journalist Eric Alterman attempts to raise the bar on when a commander in chief can dissemble. He argues that although wartime may be when presidents can get away with lying—and is perhaps even when they most feel the need to lie—recent American history suggests that it is also when the costs of a lie may be too high.

Alterman’s thesis is that the lies told by Roosevelt during World War II, specifically those concerning the promises he made to Stalin at Yalta, helped set in motion the Cold War, and that unrealistic expectations in the West about the future of Eastern Europe fueled Soviet suspicions of America’s motives. These suspicions, he argues, were stoked by Dwight Eisenhower’s public denials about Francis Gary Powers’s disastrous U-2 spy flight over the Soviet Union. The accompanying loss of U.S. credibility helped foment the Cuban missile crisis. In turn, the fiction that John F. Kennedy and his team of White House hagiographers spread about an uncompromising U.S. stance in October 1962 helped beget Vietnam: ever watchful of his political rival Robert Kennedy, Lyndon Johnson felt compelled to embrace the myth of a hard-line response to Communist adventure—never mind the truth that JFK had traded some NATO missiles in Turkey for Nikita Khrushchev’s missiles in Cuba.

Alterman is considered an ideological man, which I am not, but I believe his revisionist argument is worth taking seriously. I say this not only because Alterman had the intellectual honesty to confront the record of liberal Democrats as well as conservative Republicans who have served as wartime commanders in chief, but also because the war in Iraq has shown the peril of taking a president’s assertions at face value—even if that president believes he is telling the truth. But don’t take it from me. Take it from Dwight Eisenhower. After he left office, Ike described the lies his administration had told about the U-2 incident as one of the biggest regrets of his presidency. “I didn’t realize how high a price we were going to have to pay for that lie,” Eisenhower told David Kraslow of Knight Newspapers. “And if I had to do it all over again, we would have kept our mouths shut.”

But presidents have trouble resisting the short-term gain a lie can afford them. It was a Kennedy administration official who claimed the government’s “right … to lie” to the public. He did so in the context of the Cuban missile crisis, which began with a little lie about Kennedy’s health—press secretary Pierre Salinger announced that the president’s trip to Chicago was being cut short because JFK had a cold. Salinger apparently never understood why anyone would question the cover story concocted to get Kennedy back to Washington in October 1962—just as Scott McClellan was taken aback by questions about the White House’s deceptive statements on Bush’s Thanksgiving trip to Iraq. “The trip certainly, I’m sure, gave a morale boost to the troops,” David Wise told a wire-service reporter who asked his opinion. “The question is, should the government engage in lying in order to essentially … protect a photo op? The answer is, no it shouldn’t. It’s a serious business when government lies, and eventually it does hurt a government and a president’s credibility.”

Wise was prescient when he took on presidential honesty in the 1970s, but his expectations are probably too pure. I was president of the White House Correspondents’ Association when Bush and his aides dissembled about the president’s Thanksgiving plans and took only a small White House press pool to Iraq. I registered no protest upon their return. The security pressures must have been extreme, I reasoned. After all, Bush neglected to tell his own parents about the trip, although they were trekking to Crawford, Texas, for the holiday dinner. On the other hand, if going to Iraq was meant to be an evocative symbol, perhaps the ease with which the White House gave everyone the slip was emblematic as well.

It has been said about several recent U.S. presidents that they seem to believe what they are saying, even if what they are saying is not true. This explanation is offered as exculpation, as if presidents are Method actors who deserve credit for their “character motivation.” Something like admiration was present in Bob Kerrey’s description of Bill Clinton as “an unusually good liar.”

Franklin Roosevelt practiced plausible deniability about his own declining health by ignoring his doctors. That’s a profound kind of denial. Ronald Reagan was trained as a dramatic actor in a medium where shortcuts are taken with facts in order, supposedly, to get at a larger truth. Reagan’s methods were internalized by aides who never set foot on a studio lot. In his memoir A Different Drummer: My Thirty Years With Ronald Reagan, Mike Deaver claims that he never once heard Reagan tell a lie, and that he believes it would have been “impossible” for Reagan to do so. “Throughout the entire Iran Contra affair, Reagan believed what he did was right,” Deaver wrote, “and that he was telling the truth to the American people.”

George W. Bush’s aides say similar things about him. Perhaps surprisingly, some of Bush’s harshest critics do as well. David Corn, author of The Lies of George W. Bush, is a longtime acquaintance of mine, and I asked him to consider the following premises:

a) That Bush considers himself a truth teller.

b) That although statements made by Bush as president have proven to be untrue, Bush generally believed they were true when he made them.

c) That even when Bush’s words have been at odds with the facts, you could hook him up to a polygraph machine; he’d still tell you he was telling the truth—and he’d pass.

To me, Corn’s book reads like an anti-Bush polemic, especially when it calls the president’s veracity into question over issues that seem more about Bush’s conservative governance. (Appointing John Ashcroft attorney general, for instance, made a “lie” of Bush’s inaugural call for civility and national unity—under the theory that Ashcroft’s archconservatism undermined any chance of détente with the Democrats.) Corn is scrupulous about the facts, however, and except for offering the caveat that Bush, like other politicians, has “stretched the truth” to help sell key policies of his presidency, Corn didn’t much quarrel with the three postulates.

“So your question is, is it still lying anyway?” Corn said. “What Bush does is that he displays a kind of willful disregard for the truth, which is the moral equivalent of lying. He doesn’t do any due diligence with the facts. Even if you believed something was true [at] the time you said it, it becomes a lie when you don’t act on new information—or correct yourself when you’ve been proven wrong.”

Whatever the president’s original sincerity about his reasons for invading Iraq, he has never to this day really acknowledged his rhetorical excess. Two nights before launching the invasion, he gave this rationale: “Intelligence gathered by this and other governments leaves no doubt that the Iraq regime continues to possess and conceal some of the most lethal weapons ever devised.” In his January 28, 2003, State of the Union address, Bush had told the nation that U.S. intelligence agencies estimated that Saddam Hussein possessed more than 30,000 munitions capable of being armed with chemical agents, and that inspectors had turned up only sixteen of them. In May, two months into the invasion, Bush proclaimed simply: “We have found the weapons of mass destruction.”

What the U.S. Army has unearthed in Iraq in three years are 500 rockets and artillery shells armed with mustard gas or the sarin nerve agent, some of them in degraded condition, buried in scattered bunkers around the country. The Army has found no evidence of an up-and-running Iraqi nuclear-weapons program. In State of Denial, Woodward quotes a December 11, 2003, recorded exchange between him and Bush in which it took the president five minutes and eighteen seconds to acknowledge the failure to find weapons of mass destruction.

Confronted later with their statements, Bush and other top officials in his government tend simply to reiterate them. At other times, they have denied making the statements altogether, even though they are on film. For instance, when Bush was asked in May 2002 about the hunt for Osama bin Laden, he replied: “I don’t know where he is. I repeat what I said. I truly am not that concerned about him.” Yet on October 13, 2004, during Bush’s third and final debate with John Kerry, the following exchange took place:

Kerry: “Six months after he said Osama bin Laden must be caught dead or alive, this president was asked, ‘Where is Osama bin Laden?’ He said, ‘I don’t know. I don’t really think about him very much. I’m not that concerned.’”

Bush: “Gosh, I just don’t think I ever said I’m not worried about Osama bin Laden. It’s kind of one of those exaggerations.”

In an ABC interview two weeks before the 2006 midterms, George Stephanopoulos asked Bush where the compromise might be on Iraq between the dueling political buzz phrases “stay the course” and “cut and run.” Bush’s response: “Well, hey, listen, we’ve never been ‘stay the course,’ George …” Liberal bloggers quickly posted on YouTube a hilarious montage of Bush using that exact phrase repeatedly. “The president of the United States is not a fact-checker,” White House communications director Dan Bartlett blurted out at a July 18, 2003, briefing about what the president knew, and when he knew it, regarding British intelligence reports of Saddam Hussein’s agents prowling around Africa in search of enriched uranium.

No one expects him to be. What they do expect is that a president who takes the nation to war knows what he’s talking about when he enumerates the reasons for that war. Which raises the central question about George W. Bush’s tenure in the White House: Even giving him the benefit of the doubt on honesty, why doesn’t the nation’s first-ever M.B.A. president demonstrate a better command of the facts?

There are three popular theories about Bush’s behavior: that the president is an intellectually incurious man who doesn’t have or want enough information to make informed decisions; that his late-life embrace of religion has given him inner peace, but also a near-absolute level of certitude; and that his demand for total loyalty discourages the give-and-take a leader needs, because aides who proffer advice or information that doesn’t jibe with administration policy are not viewed as team players.

Several White House aides, past and present, say it’s simply wrong to suggest that people around Bush are afraid to bring him bad news or contrary opinions. “I was never intimidated,” Michael Gerson told me. Former press secretary Ari Fleischer said the same thing. On July 2, 2003, when Bush made his infamous “Bring ’em on” taunt, Fleischer told the president pointedly as they walked out of the Roosevelt Room that this statement would offend a military mom with a child serving in Iraq. “I always found it easy to raise objections like that,” Fleischer said. “Easy as a layup.”

Yet Bush has somehow managed to be consistently surprised by events in the war of his own making. In the absence of any plausible explanation from his loyalists, it is his critics who are writing the history of this period. One consistent theme of these critics is that faith trumps fact for Bush when it comes to Iraq. There’s ample supporting evidence for that belief, starting with Bush’s bio-fib to Brit Hume in 2003 that he didn’t read newspapers. Hume was rightly incredulous, and Laura Bush later contradicted her husband in an exchange with Jay Leno. Yet in telling this particular lie, Bush may have been revealing an important truth about himself. What he was getting at, apparently, is that he doesn’t read columns and editorials. The reason, he told me and other journalists, was his need to “stay optimistic.”

There can be a thin line between optimism and delusion. During his 2004 reelection campaign, Bush went to a Boeing aerospace plant in Ridley Park, Pennsylvania, to put in a plug for the nation’s fledgling missile-defense system, and asserted, “We say to those tyrants who believe they can blackmail America and the free world, ‘You fire, we’re going to shoot it down.’” Given the current technology, Bush’s statement was a declaration of wishful thinking, not military reality. Bush had made an equally dubious assertion while running in 2000, after several highly publicized exonerations of men on death row had prompted nationwide soul-searching over capital punishment. “Everybody who’s been executed [in Texas] is guilty of the crime of which they’ve been convicted,” Bush said. He may have believed this, but in Austin, Bush presided over more executions than any other governor in modern history, and did so in a state that offers only rudimentary legal services for indigent defendants, enforces strict time limits for post-conviction appeals, and does little in the way of executive-branch or parole-board review of trial-court verdicts. Really, Bush had no way of knowing that what he hoped was true actually was true—and there were empirical reasons to wonder. This example has a recent echo in Bush’s confident-sounding assurance in a September 6, 2006, East Room speech: “We have in place a rigorous process to ensure those held at Guantánamo Bay belong at Guantánamo.”

This kind of blasé optimism has undergirded Bush’s entire policy on Iraq, and its consequences have been grim. As Peter W. Galbraith, author of a new book critical of Bush’s prosecution of the war, put it: “With regard to Iraq, President Bush and his top advisers have consistently substituted wishful thinking for analysis and hope for strategy.” In May 2003, under that now-infamous Mission Accomplished banner aboard the U.S.S. Lincoln, Bush proclaimed, “Iraq is free” and “major combat operations in Iraq have ended.” But almost four years later, scores of Iraqis are still dying every day, the country is in the throes of civil war, and American forces remain enmeshed in a conflict with no clear end. “The strategy was denial,” wrote Woodward in the last paragraph of State of Denial. “With all Bush’s upbeat talk and optimism, he had not told the American public the truth about what Iraq had become.”

Bush’s aides bristle at such words, but when asked why the president refuses to go back and correct his rhetorical mistakes—to level with the American people about where and why he was wrong—they falter or fall back on platitudes. Queried about Bush’s failure to cite what he had ever done wrong, Fleischer answers: “It’s the foolish politician who looks backwards and wallows in his difficulties. A good politician looks forward. It’s the difference between a pessimist and an optimist, between a loser and a winner, between Jimmy Carter and George W. Bush.” In his 2006 State of the Union address, the president voiced the same sentiment in addressing his Iraq War critics: “Hindsight alone is not wisdom,” he said, “and second-guessing is not a strategy.”

But optimism, while more appealing than its opposite number, is not a strategy, any more than hindsight. And it has the added drawback of not offering any Plan B. Regarding the failure to find weapons of mass destruction, Gerson said that the White House staff itself was never told what went wrong. “As opposed to being deceptive, when those weapons were not found I think people were shocked,” he said. “I mean, it was beyond belief.”

Now that the midterm elections are over, some speculate that the administration will be more candid and less dogmatic about Iraq. Yet some results of this administration’s self-deception are not reversible. While researching Fiasco, Thomas Ricks spoke to numerous battlefield commanders in Iraq who left thousands of tons of conventional weapons undisturbed as they raced toward Baghdad in the first heady days of the invasion. They didn’t have enough troops to guard these caches of weapons, and they didn’t dare destroy them, believing as they did that underneath might lie highly dangerous stockpiles of chemical and biological weapons. “So the bunkers were often bypassed and left undisturbed by an invasion force that was already stretched thin,” Ricks wrote. “And the insurgents were able to arm themselves at leisure.”

Of course, posterity rewards success, not truth. If D-Day had failed, FDR likely would have been remembered not as a heroic wartime president but as a tragic figure whose self-serving deceptions about his own health prolonged a savage war and jeopardized victory. And if Japan had not surrendered even after atomic bombs were dropped on the civilian populations of two of its cities, Truman might be recalled as a butcher. Conversely, if U.S. forces had found the fabled weapons of mass destruction in Iraq, would Bush’s integrity be under question? Probably not. For presidents, consequences matter more than truth. Bush almost certainly understands this; it may inform his oft-expressed hope of being judged positively in the long sweep of history. Yet today he remains reluctant to reckon not only with his statements but also with their results. President Kennedy may have lied to the public about why the Russians removed their missiles from Cuba, but he knew the truth of the situation well enough to negotiate the compromise that led to their removal. Bush, on the other hand, seems unwilling to recognize that the reality of the situation in Iraq does not conform to his vision of it. The most dangerous lies a president can tell, it would seem, are the lies he tells himself.
