It is time to think carefully about the next year. Our position is uniquely promising—and uniquely difficult.

The promise lies in the fact that you are going to win the election. Nothing is guaranteed in politics, but based on everything we know, and barring an act of God or a disastrous error on our side, one year from today you will be sworn in as the forty-sixth president of the United States. And you will be the first president since before the Civil War to come from neither the Republican nor the Democratic Party.1 This is one aspect of your electoral advantage right now: having created our new party, you are already assured of its nomination, whereas the candidates from the two legacy parties are still carving themselves up in their primaries.2

The difficulty, too, lies in the fact that you are going to win. The same circumstances that are bringing an end to 164 years of two-party rule have brought tremendous hardship to the country. This will be the first time since Franklin Roosevelt took office in 1933 that so much is demanded so quickly from a new administration. Our challenge is not just to win the election but to win in a way that gives us a chance to address economic failures that have been fifty years in the making.

That is the purpose of this memo: to provide the economic background for the larger themes in our campaign. Although economic changes will be items one through ten on your urgent "to do" list a year from now, this is not the place to talk about them in detail. There will be plenty of time for that later, with the policy guys. Instead I want to speak here not just as your campaign manager but on the basis of our friendship and shared efforts these past twenty years. Being completely honest about the country's problems might not be necessary during the campaign—sounding pessimistic in speeches would hurt us. But we ourselves need to be clear about the challenge we face. Unless we understand how we got here, we won't be able to find the way out once you are in office.

Politics is about stories—the personal story of how a leader was shaped, the national story of how America's long saga has led to today's dramas. Your personal story needs no work at all. Dwight Eisenhower was the last president to enter office with a worldwide image of competence, though obviously his achievements were military rather than technological. But we have work to do on the national story.

When it comes to the old parties, the story boils down to this: the Democrats can't win, and the Republicans can't govern. Okay, that's an overstatement; but the more nuanced version is nearly as discouraging.

The past fifty years have shown that the Democrats can't win the presidency except when everything goes their way. Only three Democrats have reached the White House since Lyndon Johnson decided to leave. In 1976 they ran a pious-sounding candidate against the political ghost of the disgraced Richard Nixon—and against his corporeal successor, Gerald Ford, the only unelected incumbent in American history. In 1992 they ran their most talented campaigner since FDR, and even Bill Clinton would have lost if Ross Perot had not stayed in the race and siphoned away votes from the Republicans. And in 2008 they were unexpectedly saved by the death of Fidel Castro. This drained some of the pro-Republican passion of South Florida's Cuban immigrants, and the disastrous governmental bungling of the "Cuba Libre" influx that followed gave the Democrats their first win in Florida since 1996—along with the election. But that Democratic administration could turn out to have been America's last. The Electoral College map drawn up after the 2010 census removed votes from all the familiar blue states except California, giving the Republicans a bigger head start from the Sunbelt states and the South.

As for the Republicans, fifty years have shown they can't govern without breaking the bank. Starting with Richard Nixon, every Republican president has left the dollar lower, the federal budget deficit higher, the American trade position weaker, and the U.S. manufacturing work force smaller than when he took office.

The story of the parties, then, is that the American people mistrust the Republicans' economic record, and don't trust the Democrats enough to let them try to do better. That is why—and it is the only reason why—they are giving us a chance. But we can move from electoral to governmental success only with a clear understanding of why so much has gone so wrong with the economy. Our internal polls show that nearly 90 percent of the public thinks the economy is "on the wrong track." Those readings should hold up, since that's roughly the percentage of Americans whose income has fallen in real terms in the past five years.

The story we will tell them begins fifteen years ago,3 and it has three chapters. For public use we'll refer to them by the names of the respective administrations. But for our own purposes it will be clearer to think of the chapter titles as "Cocking the Gun," "Pulling the Trigger," and "Bleeding."

1. Cocking the Gun

Everything changed in 2001. But it didn't all change on September 11.

Yes, the ramifications of 9/11 will be with us for decades, much as the aftereffects of Pearl Harbor explain the presence of thousands of U.S. troops in Asia seventy-five years later. Before 2001 about 12,000 American troops were stationed in the Middle East—most of them in Kuwait and Saudi Arabia. Since 2003 we have never had fewer than 100,000 troops in CENTCOM's theater, most of them on active anti-insurgency duty. The locale of the most intense fighting keeps changing—first Afghanistan and Iraq, then Pakistan and Egypt, now Saudi Arabia and the frontier between Turkey and the Republic of Kurdistan—but the commitment goes on.

Before there was 9/11, however, there was June 7, 2001. For our purposes modern economic history began that day.

On June 7 President George W. Bush celebrated his first big legislative victory. Only two weeks earlier his new administration had suffered a terrible political blow, when a Republican senator left the party and gave Democrats a one-vote majority in the Senate. But the administration was nevertheless able to persuade a dozen Democratic senators to vote its way and authorize a tax cut that would decrease federal tax revenues by some $1.35 trillion between then and 2010.

This was presented at the time as a way to avoid the "problem" of paying down the federal debt too fast. According to the administration's forecasts, the government was on the way to running up $5.6 trillion in surpluses over the coming decade. The entire federal debt accumulated between the nation's founding and 2001 totaled only about $3.2 trillion—and for technical reasons at most $2 trillion of that total could be paid off within the next decade.4 Therefore some $3.6 trillion in "unusable" surplus—or about $12,000 for every American—was likely to pile up in the Treasury. The administration proposed to give slightly less than half of that back through tax cuts, saving the rest for Social Security and other obligations.
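The arithmetic behind those projections is simple enough to work out. A minimal sketch, using the dollar figures quoted above (the population figure is my own round assumption for 2001, not from the memo):

```python
# Back-of-the-envelope arithmetic for the 2001 surplus projections.
# All dollar figures are the ones quoted in the text; the population
# estimate (~285 million in 2001) is an assumption for illustration.
projected_surplus = 5.6e12   # forecast ten-year surplus
payable_debt      = 2.0e12   # portion of the debt that could be retired
tax_cut           = 1.35e12  # size of the enacted tax cut

unusable = projected_surplus - payable_debt   # the "unusable" surplus
per_capita = unusable / 285e6                 # assumed U.S. population

print(f"unusable surplus: ${unusable/1e12:.1f} trillion")
print(f"per American:     ${per_capita:,.0f}")
```

Running this reproduces the figures in the paragraph: $3.6 trillion left over, or roughly $12,000 per American, of which the tax cut returned a portion.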

Congress agreed, and it was this achievement that the president celebrated at the White House signing ceremony on June 7. "We recognize loud and clear the surplus is not the government's money," Bush said at the time. "The surplus is the people's money, and we ought to trust them with their own money."

If the president or anyone else at that ceremony had had perfect foresight, he would have seen that no surpluses of any sort would materialize, either for the government to hoard or for taxpayers to get back. (A year later the budget would show a deficit of $158 billion; a year after that $378 billion.) By the end of Bush's second term the federal debt, rather than having nearly disappeared, as he expected, had tripled. If those in the crowd had had that kind of foresight, they would have called their brokers the next day to unload all their stock holdings. A few hours after Bush signed the tax-cut bill, the Dow Jones industrial average closed at 11,090, a level it has never reached again.5

In a way it doesn't matter what the national government intended, or why all forecasts proved so wrong. Through the rest of his presidency Bush contended that the reason was 9/11—that it had changed the budget as it changed everything else. It forced the government to spend more, for war and for homeland security, even as the economic dislocation it caused meant the government could collect less. Most people outside the administration considered this explanation misleading, or at least incomplete. For instance, as Bush began his second term the nonpartisan Congressional Budget Office said that the biggest reason for growing deficits was the tax cuts.6

But here is what really mattered about that June day in 2001: from that point on the U.S. government had less money to work with than it had under the previous eight presidents. Through four decades and through administrations as diverse as Lyndon Johnson's and Ronald Reagan's, federal tax revenue had stayed within a fairly narrow band. The tax cuts of 2001 pushed it out of that safety zone, reducing it to its lowest level as a share of the economy in the modern era.7 And as we will see, these cuts—the first of three rounds8—did so just when the country's commitments and obligations had begun to grow.

As late as 2008 the trend could have been altered, though the cuts of 2003 and 2005 had made things worse. But in the late summer of 2008 Senate Republicans once again demonstrated their mastery of the basic feints and dodges of politics. The tax cuts enacted during Bush's first term were in theory "temporary," and set to expire starting in 2010. But Congress didn't have to wait until 2010 to decide whether to make them permanent, so of course the Republican majority scheduled the vote at the most awkward moment possible for the Democrats: on the eve of a close presidential election. The Democratic senators understood their dilemma. Either they voted for the tax cuts and looked like hypocrites for all their past complaints, or they voted against them and invited an onslaught of "tax and spend" attack ads in the campaign. Enough Democrats made the "smart" choice. They held their seats in the election, and the party took back the presidency. But they also locked in the tax cuts, which was step one in cocking the gun.9

The explanation of steps two and three is much quicker: People kept living longer, and they kept saving less. Increased longevity is a tremendous human achievement but a fiscal challenge—as in any household where people outlive their savings. Late in 2003 Congress dramatically escalated the fiscal problem by adding prescription-drug coverage to Medicare, with barely any discussion of its long-term cost. David M. Walker, the government's comptroller general at the time, said that the action was part of "the most reckless fiscal year in the history of the Republic," because that vote and a few other changes added roughly $13 trillion to the government's long-term commitments.

From the archives:

"Spendthrift Nation" (January 2003)
It's a precarious situation: U.S. consumer spending is sustaining the economy—but we need to save more to prepare for the surge in retirements. Here's how to boost personal saving without undermining the economic recovery. By Michael Calabrese and Maya MacGuineas

The evaporation of personal savings was marveled at by all economists but explained by few. Americans saved about eight percent of their disposable income through the 1950s and 1960s, slightly more in the 1970s and 1980s, slightly less and then a lot less in the 1990s. At the beginning of this century they were saving, on average, just about nothing.10

The possible reasons for this failure to save—credit-card debt? a false sense of wealth thanks to the real-estate bubble?11 stagnant real earnings for much of the population?—mattered less than the results. The country needed money to run its government, and Americans themselves weren't about to provide it. This is where the final, secret element of the gun-cocking process came into play: the unspoken deal with China.

The terms of the deal are obvious in retrospect. Even at the time, economists discussed the arrangement endlessly in their journals. The oddity was that so few politicians picked up on what they said. The heart of the matter, as we now know, was this simple equation: each time Congress raised benefits, reduced taxes, or encouraged more borrowing by consumers, it shifted part of the U.S. manufacturing base to China.

Of course this shift had something to do with "unfair" trade, undereducated American workers, dirt-cheap Chinese sweatshops, and all the other things that American politicians chose to yammer about. But the "jobless recovery" of the early 2000s and the "jobless collapse" at the end of the decade could never have occurred without the strange intersection of American and Chinese (plus Japanese and Korean) plans. The Chinese government was determined to keep the value of its yuan as low as possible, thus making Chinese exports as attractive as possible, so that Chinese factories could expand as quickly as possible, to provide work for the tens of millions of people trooping every year to Shanghai or Guangzhou to enter the labor force. To this end, Chinese banks sent their extra dollars right back to the U.S. Treasury, in loans to cover the U.S. budget deficit; if they hadn't, normal market pressures would have driven up the yuan's value.12 This, in turn, would have made it harder for China to keep creating jobs and easier for America to retain them. But Americans would have had to tax themselves to cover the deficit.

From the archives:

"America's 'Suez Moment'" (January 2003)
The growing trade deficit threatens U.S. living standards and makes the country dangerously vulnerable to economic extortion. The way out is to make foreigners act more like us. By Sherle R. Schwenninger

This arrangement was called "Bretton Woods Two," after the regime that kept the world economy afloat for twenty-five years after World War II. The question economists debated was how long it could last. One group said it could go on indefinitely, because it gave each country's government what it really wanted (for China, booming exports and therefore a less dissatisfied population; for America, the ability to spend more while saving and taxing less). But by Bush's second term the warning signals were getting louder. "This is starting to resemble a pyramid scheme," the Financial Times warned early in 2005.13 The danger was that the system was fundamentally unstable. Almost overnight it could go from working well to collapsing. If any one of the Asian countries piling up dollars (and most were doing so) began to suspect that any other was about to unload them, all the countries would have an incentive to sell dollars as fast as possible, before they got stuck with worthless currency. Economists in the "soft landing" camp said that adjustments would be gradual, and that Chinese self-interest would prevent a panic. The "hard landing" camp—well, we know all too well what they were concerned about.

2. Pulling the Trigger

The 2008 election, like those in 2000 and 2004, could have gone either way. If Fidel Castro had died two years earlier, the second Bay of Pigs tragedy and related "regime change" difficulties might have been dim memories by Election Day. Or if he had died a year later, the Cuban-American bloc of Florida voters would have been as reliably Republican in 2008 as in the previous fifty years. Since the red state-blue state divide was otherwise the same as in 2000 and 2004, if the Republicans had held Florida they would presumably have held the White House as well—despite mounting unease about debt, deficits, job loss, and rising U.S. casualties in Pakistan.

But by dying when he did, at eighty-two, and becoming the "October surprise" of the 2008 campaign, Castro got revenge on the Republicans who had for years supported the Cuban trade embargo. Better yet, he got revenge on his original enemies, the Democrats, too.14 Castro couldn't have planned it, but his disappearance was the beginning—the first puff of wind, the trigger—of the catastrophe that followed.

Or perhaps we should call it the first domino to fall, because what then happened had a kind of geometric inevitability. The next domino was a thousand miles across the Caribbean, in Venezuela. Hugo Chavez, originally elected as a crusading left-winger, was by then well into his role as an outright military dictator. For years our diplomats had grumbled that Chavez was "Castro with oil," but after the real Castro's death the comparison had new meaning. A right-wing militia of disgruntled Venezuelans, emboldened by the news that Castro was gone, attempted a coup at the beginning of 2009, shortly after the U.S. elections. Chavez captured the ringleaders, worked them over, and then broadcast their possibly false "confession" that they had been sponsored by the CIA. That led to Chavez's "declaration of economic war" against the United States, which in practice meant temporarily closing the gigantic Amuay refinery, the source of one eighth of all the gasoline used on American roads—and reopening it two months later with a pledge to send no products to American ports.

From the archives:

"The Fuel Subsidy We Need" (January 2003)
Oil dependence is still the Achilles' heel of the American empire. It doesn't have to be—and if we don't want to lose economic ground to Europe, it can't be. By Ricardo Bayon

That was when the fourth—and worst—world oil shock started.15 For at least five years economists and oilmen alike had warned that there was no "give" in the world oil market. In the early 2000s China's consumption was growing five times as fast as America's—and America was no slouch. (The main difference was that China, like India, was importing oil mainly for its factories, whereas the United States was doing so mainly for its big cars.16) Even a temporary disruption in the flow could cause major dislocations.

All the earlier oil shocks had meant short-term disruptions in supply (that's why they were "shocks"), but this time the long term was also in question. Geologists had argued about "peaking" predictions for years, but the concept was on everyone's lips by 2009.17

The Democrats had spent George Bush's second term preparing for everything except what was about to hit them. Our forty-fourth president seemed actually to welcome being universally known as "the Preacher," a nickname like "Ike" or "Honest Abe." It was a sign of how much emphasis he'd put on earnestly talking about faith, family, and firearms to voters in the heartland, in his effort to help the Democrats close the "values gap." But he had no idea what to do (to be fair, the man he beat, "the Veep," would not have known either) when the spot price of oil rose by 40 percent in the week after the Chavez declaration—and then everything else went wrong.

Anyone who needed further proof that God is a Republican would have found it in 2009. When the price of oil went up, the run on the dollar began. "Fixed exchange rates with heavy intervention [in essence, Bretton Woods Two] have enormous capacity to create an illusory sense of stability that could be shattered very quickly," Lawrence Summers had warned in 2004. "That is the lesson of Britain in 1992, of Mexico in 1994, of emerging Asia in 1997, of Russia in 1998, and of Brazil in 1998." And of the United States in 2009. It didn't help that Hugo Chavez had struck his notorious then-secret deal with the Chinese: preferential future contracts for his oil, which China needed, in return for China's backing out of Bretton Woods Two, which Chavez wanted.

There had been hints of how the falling dominoes would look as early as January of 2005. In remarks made at the World Economic Forum in Davos, Switzerland, Fan Gang, the director of China's nongovernmental National Economic Research Institute, said that "the U.S. dollar is no longer seen as a stable currency."18 This caused a quick flurry in the foreign-exchange markets. It was to the real thing what the World Trade Center car bomb in 1993 was to 9/11.

When we read histories of the late 1920s, we practically want to scream, Stop! Don't buy all that stock on credit! Get out of the market before it's too late! When we read histories of the dot-com boom in the late 1990s, we have the same agonizing sense of not being able to save the victims from themselves: Don't take out that home-equity loan to buy stocks at their peak! For God's sake, sell your Cisco shares when they hit 70, don't wait till they're back at 10!

In retrospect, the ugly end is so obvious and inevitable. Why didn't people see it at the time? The same clearly applies to what happened in 2009. Economists had laid out the sequence of causes and effects in a "hard landing," and it worked just as they said it would.

Once the run on the dollar started, everything seemed to happen at once. Two days after the Venezuelan oil shock the dollar was down by 25 percent against the yen and the yuan. Two weeks later it was down by 50 percent. By the time trading "stabilized," one U.S. dollar bought only 2.5 Chinese yuan—not eight, as it had a year earlier.19
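The quoted exchange rates imply an even steeper fall than the mid-panic figures suggest. A quick sketch of the conversion, using only the yuan-per-dollar quotes given above:

```python
def dollar_depreciation(old_yuan_per_usd, new_yuan_per_usd):
    """Fraction of the dollar's yuan value lost between two quotes."""
    return 1 - new_yuan_per_usd / old_yuan_per_usd

# Quotes from the scenario: 8 yuan per dollar a year earlier,
# 2.5 by the time trading "stabilized".
print(f"{dollar_depreciation(8.0, 2.5):.1%}")
```

That is, the move from 8 yuan to 2.5 yuan per dollar was not a 50 percent decline but nearly 69 percent; the two-week figures were only the middle of the slide.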

As the dollar headed down, assets denominated in dollars suddenly looked like losers. Most Americans had no choice but to stay in the dollar economy (their houses were priced in dollars, as were their savings and their paychecks), but those who had a choice unloaded their dollar holdings fast.20 The people with choices were the very richest Americans, and foreigners of every sort. The two kinds of assets they least wanted to hold were shares in U.S.-based companies, since the plummeting dollar would wipe out any conceivable market gains, and dollar-based bonds, including U.S. Treasury debt. Thus we had twin, reinforcing panics: a sudden decline in share prices plus a sudden selloff of bonds and Treasury holdings. The T-note selloff forced interest rates up, which forced stock prices further down, and the race to the bottom was on.

Because interest rates had been so low for so long, much of the public had forgotten how nasty life could be when money all of a sudden got tight.21 Every part of the cycle seemed to make every other part worse.

Businesses scaled back their expansion or investment plans, since borrowed money was more expensive. That meant fewer jobs. Mortgage rates went up, so buyers who might have bid on a $400,000 house could now handle only $250,000. That pushed real-estate values down; over time the $400,000 house became a $250,000 house. Credit-card rates were more onerous, so consumers had to cut back their spending. Some did it voluntarily, others in compliance with the Garnishee Amendments to the Bankruptcy Act of 2008. Businesses of every sort had higher fixed costs: for energy, because of the oil-price spike; for imported components, because of the dollar's crash; for everything else, because of ripple effects from those changes and from higher interest rates. Those same businesses had lower revenues, because of the squeeze on their customer base. Early in Bush's second term economists had pointed out that the U.S. stock indexes were surprisingly weak considering how well U.S. corporations had been doing.22 The fear of just these developments was why.
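The mortgage arithmetic above can be checked directly. A minimal sketch using the standard fixed-rate amortization formula; the 6 percent and 11 percent rates are my own assumptions chosen to match the $400,000-to-$250,000 example, not figures from the memo:

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate mortgage payment (monthly compounding)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

def affordable_principal(payment, annual_rate, years=30):
    """Largest loan a fixed monthly payment can service at a given rate."""
    r = annual_rate / 12
    n = years * 12
    return payment * ((1 + r) ** n - 1) / (r * (1 + r) ** n)

# A buyer's budget is set by the monthly payment, not the price tag.
budget = monthly_payment(400_000, 0.06)          # what a $400k loan costs at 6%
print(round(affordable_principal(budget, 0.11))) # what that budget buys at 11%
```

The same monthly payment that carried a $400,000 loan at 6 percent carries only about $250,000 at 11 percent, which is how a rate spike alone can knock a third or more off what buyers can bid.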

Americans had lived through a similar self-intensifying cycle before—but not since the late 1970s, when many of today's adults were not even born. Back in those days the sequence of energy-price spike, dollar crash, interest-rate surge, business slowdown, and stock-market loss had overwhelmed poor Jimmy Carter—he of the promise to give America "a government as good as its people." This time it did the same to the Preacher, for all his talk about "a new Democratic Party rooted in the oldest values of a free and faithful country." When he went down, the future of his party almost certainly went with him.

The spate of mergers and acquisitions that started in 2010 was shocking at the time but looks inevitable in retrospect. When the CEOs of the three remaining U.S. airlines had their notorious midnight meeting at the DFW Hilton, they knew they were breaking two dozen antitrust laws and would be in financial and legal trouble if their nervy move failed. But it worked. When they announced the new and combined AmFly Corporation, regulators were in no position to call their bluff. At their joint press conference the CEOs said, Accept our more efficient structure or we'll all declare bankruptcy, and all at once. The efficiencies meant half as many flights (for "fuel conservation") as had been offered by the previously competing airlines, to 150 fewer cities, with a third as many jobs (all non-union).23 Democrats in Congress didn't like it, nor did most editorialists, but the administration didn't really have a choice. It could swallow the deal—or it could get ready to take over the routes, the planes, the payrolls, and the passenger complaints, not to mention the decades of litigation.

Toyota's acquisition of General Motors and Ford, in 2012, had a similar inevitability. Over the previous decade the two U.S. companies had lost money on every car they sold. Such profit as they made was on SUVs, trucks, and Hummer-style big rigs. In 2008, just before the oil shock, GM seemed to have struck gold with the Strykette—an adaptation of the Army's Stryker vehicle, so famous from Iraq and Pakistan, whose marketing campaign attracted professional women. Then the SUV market simply disappeared. With gasoline at $6 a gallon, the prime interest rate at 15 percent, and the stock and housing markets in the toilet, no one wanted what American car makers could sell.24 The weak dollar, and their weak stock prices, made the companies a bargain for Toyota.25

For politicians every aspect of this cycle was a problem: the job losses, the gasoline lines, the bankruptcies, the hard-luck stories of lifetime savings vanishing as the stock market headed down. But nothing matched the nightmare of foreclosures.

For years regulators and financiers had worried about the "over-leveraging" of the American housing market. As housing prices soared in coastal cities, people behaved the way they had during the stock-market run-up of the 1920s: they paid higher and higher prices; they covered more and more of the purchase price with debt; more and more of that debt was on "floating rate" terms—and everything was fine as long as prices stayed high and interest rates stayed low.

When the market collapsed, Americans didn't behave the way economic theory said they should.26 They behaved the way their predecessors in the Depression had: they stayed in their houses, stopped paying their mortgages, and waited for the banks to take the next step. Through much of the Midwest this was a manageable problem: the housing market had gone less berserk to begin with, and, as in the Great Depression, there was a longer-term, more personal relationship between customers and financiers. But in the fastest-growing markets—Orlando, Las Vegas, the Carolina Research Triangle, northern Virginia—the banks simply could not wait. The deal brokered at the White House Security-in-Shelter Summit was ingenious: federal purchase of one million RVs and mobile homes, many of them built at idle auto or truck factories; subsidies for families who agreed to leave foreclosed homes without being evicted by marshals, such that they could buy RVs with no payments for five years; and the use of land at decommissioned military bases for the new RV villages. But it did not erase the blogcam live broadcasts of families being evicted, or the jokes about the "Preachervilles" springing up at Camp Lejeune, the former Fort Ord, and the Philadelphia naval shipyard.

Here is how we know that a sitting president is going to lose: he is seriously challenged in his own party's primaries.27 So if the economic tailspin had left any doubts about the prospects for the Preacher and his party, they were removed by the clamor to run against him in the Democratic primaries of 2012. The party's biggest names were all there: the senators from New York, Illinois, and Florida; the new governors of California and Pennsylvania; the mayor of New York, when it looked as if the Olympic Games would still be held there that fall; and the actor who in his three most recent films had captured Americans' idea of how a president should look and sound, and who came closest to stealing the nomination from the incumbent.

He and the rest of them were probably lucky that their campaigns fell short—not that any politician ever believes that. The Democratic nomination in 2012 was obviously a poisoned chalice, but a politician can't help thinking that a poisoned chalice is better than no chalice at all. The barrier none of them could have overcome was the financial crisis of state and local government.

All that befell the federal budget during the collapse of 2009-2012 happened to state and local governments, too, but more so. They had to spend more—on welfare, Medicaid, jails, police officers—while taking in less. One by one their normal sources of funding dried up.28 Revenues from the multi-state lottery and the FreedomBall drawings rose a bit. Unfortunately, the surge of spending on casino gambling in forty-three states and on legalized prostitution in thirty-one didn't benefit state and local governments, because except in Nevada those activities were confined to Indian reservations, and had only an indirect stimulative effect.

And many governors and mayors faced a reality the president could avoid: they operated under constitutions and charters that forbade deficit spending. So they had no practical choice but to tighten the clamps at both ends, cutting budgets and raising taxes. The process had begun before the crash, as politicking in most state capitols was dominated by "intractable" budget disputes.29 When the downturn really hit, even governors who had never heard of John Maynard Keynes sensed that it was a bad idea to raise taxes on people who were being laid off and evicted. But they were obliged by law to balance their budgets. All mayors and governors knew that it would be dicey to renege on their basic commitments to education, public safety, public health, and public infrastructure. But even in hindsight it is hard to know what else they could have done. California did too much too fast in closing sixty-three of its 110 community colleges30 and imposing $9,500 annual "user fees" in place of the previous nominal fees. Its solution to the financing crisis on its high-end campuses was defter—especially the "Great Pacific Partnership" between the University of California and Tsinghua University, in Beijing. This was a win-win arrangement, in which the Chinese Ministry of Education took over the funding of the UC Berkeley physics, computer-science, and biology laboratories, plus the genomics laboratory at UC San Francisco, in exchange for a 51 percent share of all resulting patents.

State and local governments across the country did what they could. Fee-for-service became the norm—first for "enrichment" programs in the schools, then to underwrite teachers' salaries, then for emergency police calls, then for inclusion in routine police and fire patrols. First in Minnesota, soon after in Michigan, New York, and Pennsylvania, there were awkward moments when the governor, exercising his power as commander in chief of the state National Guard, ordered the Guard's medical units to serve in hospitals that had furloughed nurses and emergency-room doctors. The Democratic president decided not to force the question of who had ultimate control over these "citizen soldiers." This averted a showdown in the short term, but became one more attack point for the Republicans about weak and vacillating Democrats. Cities within 150 miles of the Mexican border opened police-service and trash-hauling contracts to companies based in Mexico. The state of Georgia, extending a practice it had begun in the early 2000s, said that it would hire no new public school teachers except under the "Partnership for Excellence" program, which brought in cut-rate teachers from India.31

The chaos in public services spelled the end for the administration, and for the Democratic Party in the long run. The Democrats couldn't defend the unions. They couldn't defend pensioners. They couldn't even do much for their limousine liberals. The nation had never been more in the mood for firm leadership. When the "Desert Eagle" scored his astonishing coup in the Saudi Arabian desert just before Christmas of 2011, America knew who its next leader would be. For a four-star general to join his enlisted men in a nighttime HALO32 special-operations assault was against all established practice. The Eagle's determination to go ahead with the stunt revealed him to be essentially a MacArthuresque ham. But the element of surprise was total, and the unit surrounded, captured, and gagged Osama bin Laden before he was fully awake.

The general's news conference the next day had the largest live audience in history, breaking the record set a few months earlier by the coronation of England's King William V. The natural grace of this new American hero was like nothing the world had seen since Charles Lindbergh landed in Paris. His politics were indistinct, but if anything, that was a plus. He was strong on defense; urgent (without details) about "fighting smart against our economic enemies"; and broadly appealing on "values"—a devout Catholic who had brought the first openly gay commandos into a front-line combat unit. ("When we were under fire, I never asked who they loved, because I knew they loved our flag.") Political pros had always assumed that America's first black president would be a Republican and a soldier, and they were right. He just didn't turn out to be Colin Powell.

The only suspense in the election was how big the win would be. By Labor Day it was clear that the Democrats might lose even the District of Columbia, whose rich residents were resentful about their ravaged stock portfolios, and whose poor residents had been cut off from Medicaid, welfare, and schools. As the nation went, so went the District, and after fifty-seven presidential elections the United States had its first across-the-board electoral sweep.

3. Bleeding

The emergencies are over. As our current president might put it, it's a war of attrition now. His administration hasn't made anything worse—and we have to admit that early on his ease and confidence were like a balm. But he hasn't made anything better, either. If not fully tired of him, the public has grown as fatalistic about the Republicans' ability to make any real difference as it already was about the Democrats'. The two-party system had been in trouble for decades. It was rigid, polarizing, and unrepresentative. The parties were pawns of special interests. The one interest group they neglected was the vast center of the American electorate, which kept seeking split-the-difference policies. Eight years of failure from two administrations have finally blown apart the tired duopoly. The hopes of our nation are bleeding away along with our few remaining economic resources.

Here is the challenge:

Our country no longer controls its economic fundamentals.

Compared with the America of the past, it has become stagnant, class-bound, and brutally unfair.

Compared with the rest of the world, it is on the way down. We think we are a great power—and our military is still ahead of China's. Everyone else thinks that over the past twenty years we finally pushed our luck too far.

To deal with these problems once in office, we must point out basic truths in the campaign.

These truths involve the past sources of our growth: savings, investment, education, innovation. We've thrown away every one of these advantages. What we would give right now to have back the $1 trillion that Congress voted away in 2008 with the Freedom From Death Tax Act!33 A relatively small share of that money might have kept our aerospace programs competitive with Europe's34—to say nothing of preparing us for advances in other forms of transportation. A little more might have made our road and highway system at least as good as China's.35 With what was left over, our companies might have been able to compete with Germany's in producing the superfast, quiet, efficient maglev trains that are now doing for travel what the jet plane did in the 1950s. Even if we couldn't afford to make the trains, with more money at least some of our states and regions might have been able to buy them, instead of just looking enviously at what China, India, and Iran have done.36

Or we could have shored up our universities. True, the big change came as early as 2002, in the wake of 9/11, when tighter visa rules, whatever their effect on reducing terrorism, cut off the flow of foreign talent that American universities had channeled to American ends.37 In the summer of 2007 China applied the name "twenty Harvards" to its ambition, announced in the early 2000s, to build major research institutions that would attract international talent. It seemed preposterous (too much political control, too great a language barrier), but no one is laughing now. The Chinese mission to Mars, with astronauts from Pakistan, Germany, and Korea, indicates the scope of China's scientific ambition. And necessity has pushed China into the lead in computerized translation technology, so that foreign students can read Chinese characters. The Historic Campus of our best-known university, Harvard, is still prestigious worldwide. But its role is increasingly that of the theme park, like Oxford or Heidelberg, while the most ambitious students compete for fellowships at the Har-Bai and Har-Bei campuses in Mumbai and Beijing. These, of course, have become each other's main rivals—whether for scores on the World Ingenuity Test or in the annual meeting of the teams they sponsor at the Rose Bowl.

Or we could at last have begun to grapple with health-care costs. We've managed to create the worst of all worlds—what the Democrats call the "30-30 problem." Thirty percent of our entire economy goes for health and medical costs,38 but 30 percent of our citizens have no regular contact with the medical system. (Except, of course, during quarantines in avian-flu season.) For people who can afford them, the "tailored therapies" of the past decade represent the biggest breakthrough in medicine since antibiotics or anesthesia. The big killers—heart disease and cancers of the colon, lung, breast, and prostate—are now manageable chronic diseases at worst, and the big moral issues involve the question of whether Baby Boomers are living "too long." But the costs are astronomical, which raises questions of both efficiency and justice. Google's embedded diagnostic technology dramatizes our problem: based on nonstop biometric testing of the thirty-seven relevant enzymes and organ-output levels, it pipes into cell-phone implants instructions for which treatment, pill, or action to take next. The system is extremely popular—for the 10 million people who can afford it. NetJet flights to the Bahamas for organ replacement illustrate the point even more sharply, although here the breakthrough was less medical than diplomatic. The World Trade Organization, after the most contentious proceeding in its history, ruled that prohibiting commerce in human organs for transplant was an unjust trade barrier. The ruling may have caused the final, fatal split in the Republican Party (libertarians were jubilant, religious conservatives appalled), but it became the foundation of an important Caribbean industry after threats of violence dissuaded many transplant centers from operating within the United States. 
Meanwhile, despite the Strong America-Strong Americans Act of 2009, which tied income-tax rates to body-mass index and cigarette consumption, smoking and eating junk food have become for our underemployed class what swilling vodka was for the dispossessed in Boris Yeltsin's Russia.

All these issues involve money, and we can't avoid talking about money in this campaign. But your ability to address an even harder issue will largely determine whether you can succeed in the job the voters are about to give you.

That problem is the sense of sunset, decline, hopelessness. America has been so resilient as a society because each American has imagined that the sky was the limit. Obviously it was not for everyone, or always. From the beginning we've had a class system, and a racial-caste system, and extended periods—the 1890s, the 1930s, the 1970s, the past few years—when many more people than usual were struggling merely to survive. But the myth of equal opportunity has been closer to reality here than in any other society, and the myth itself has mattered.

My father, in explaining why it was so painful for him to see a lifetime's savings melt away after the Venezuelan crisis, told me about a political speech he remembered from his own youth. It was by Daniel Patrick Moynihan, a Harvard professor who later became a politician. In the late 1960s, when American prosperity held despite bitter political turmoil, Moynihan told left-wing students why preserving that prosperity should be important even to them. We know Europe from its novels, Moynihan said: the old ones, by Austen and Dickens and Stendhal, and the more recent ones, too. We know it as a static society. Young people, seeking opportunity, have to wait for old people to die. A whole life's prospects depend on the size of an inheritance. People know their place. America, Moynihan said fifty years ago, must never become a place like that.

That is the place we have become. Half this country's households live on less than $50,000 a year. That sounds like a significant improvement from the $44,000 household median in 2003. But a year in private college now costs $83,000, a day in a hospital $1,350, a year in a nursing home $150,000—and a gallon of gasoline $9. Thus we start off knowing that for half our people there is no chance—none—of getting ahead of the game. And really, it's more like 80 percent of the public that is priced out of a chance for future opportunity. We have made a perfect circle—perfect in closing off options. There are fewer attractive jobs to be had, even though the ones at the top, for financiers or specialty doctors, are very attractive indeed. And those who don't start out with advantages in getting those jobs have less and less chance of moving up to them.

Jobs in the middle of the skill-and-income distribution have steadily vanished if any aspect of them can be done more efficiently in China, India, or Vietnam. The K-12 schools, the universities, the ambitious research projects that could help the next generation qualify for better jobs, have weakened or dried up.39 A dynamic economy is always losing jobs. The problem with ours is that we're no longer any good at creating new ones. America is a less attractive place for new business because it's a less attractive place, period.40

In the past decade we've seen the telephone companies disappear. Programming, data, entertainment, conversation—they all go over the Internet now. Pharmaceuticals are no longer mass-produced but, rather, tailored to each patient's genetic makeup. The big airlines are all gone now, and much of publishing, too. The new industries are the ones we want. When their founders are deciding where to locate, though, they'll see us as a country with a big market—and with an undereducated work force, a rundown infrastructure, and a shaky currency. They'll see England as it lost its empire. They'll see Russia without the oil reserves, Brezhnev's Soviet Union without the repression. They'll see the America that Daniel Patrick Moynihan feared.

This story is now yours to tell, and later I'll turn to notes for the stump speech. But remember that the reality of the story reaches backward, and that is why I have concentrated on the missed opportunities, the spendthrift recklessness, the warnings America heard but tuned out. To tell it that way in public would of course only make things worse, and we can't afford the recriminations or the further waste of time. The only chance for a new beginning is to make people believe there actually is a chance.

1. The last one was Millard Fillmore, a Whig. We will not emphasize this detail.

2. Also, though I never thought I'd say it, thank God for the Electoral College. In only two states, Michigan and Maine, are you polling above 50 percent of the total vote—in Michigan because of the unemployment riots, in Maine because that's what they're like. But you will probably have a strong plurality in at least forty other states, yielding a Reagan-scale electoral-vote "mandate."

3. Nothing in history ever quite "begins." Did America's problems with militant Islam begin in 2001? Or twenty years earlier, when we funded the anti-Soviet mujahideen in Afghanistan, who later turned their weapons against us? Or sixty years before that, with the breakup of the Ottoman Empire after World War I? Or during the Crusades? Similarly, warning signs of today's economic problems were apparent in the mid-1960s. But the big change started fifteen years ago, at the beginning of this century.

4. The federal debt consists of bills, notes, and bonds that come due at different periods—thirteen weeks, five years, twenty years. The main way to retire debt is to pay off holders on the due date. Only $2 trillion worth of debt would have matured within a decade, so only that much could be paid off. That is why the Bush administration's first budget message said, "Indeed, the President's Budget pays down the debt so aggressively that it runs into an unusual problem—its annual surpluses begin to outstrip the amount of maturing debt starting in 2007."

5. In 2005 Ben White, of The Washington Post, noted the coincidence of the Dow's peak and Bush's signing of the tax-cut bill.

6. Late in January of 2005 the CBO calculated that policy changes during Bush's first term had increased the upcoming year's deficit by $539 billion. Of that amount about 37 percent could be attributed to warfare, domestic security, and other post-9/11 commitments; 48 percent resulted from the tax cuts; and the rest came from other spending increases.

7. This CBO chart illustrates the pattern. The big dive is the result of the 2001 and 2003 tax cuts.

From 1962 to 2002, when federal revenues were low they were around 17.5 percent of GDP, and when they were high they neared 20 percent. Once, they went even higher: to 20.8 percent in Clinton's last year, driven there by higher tax rates and by capital-gains revenue from the bubble economy. The 2001 changes pushed tax receipts down toward 16 percent—the lowest level since 1959.

8. In 2003 Congress approved a second round of tax cuts. In 2005, after a fifty-fifty deadlock, the Senate failed to enact a "pay as you go" provision, which would have required the administration to offset any tax cuts or spending increases by savings in the budget.

9. Through the early 2000s the Government Accountability Office issued warnings about the consequences of extending the tax cuts. This chart, from 2004, showed what would happen to the budget if the tax cuts were locked in.

Its main point was that the basic operating costs of the federal government (interest payments, Social Security, and Medicare and Medicaid—the unglamorous long-term payments it is legally committed to make) were growing, and the money to cover them was not. As the GAO had predicted, our tax revenue in 2015 left only a small margin after covering fixed costs. From that remainder comes the Pentagon, the national parks, and everything else. Soon revenues won't cover even the fixed costs.

10. "In the last year, the net national savings rate of the United States has been between one and two percent," the economist and then president of Harvard Lawrence Summers said in 2004, a year before the rate hit its nadir. "It represents the lowest net national savings rate in American history and, I believe, that of any major nation." Summers gave the speech five years after his appointment as Treasury secretary and five years before his nomination as chairman of the Federal Reserve Board.

11. Robert Shiller, an economist at Yale, was ahead of most other observers in predicting the collapse of the tech-stock bubble of the 1990s and the personal-real-estate bubble a decade later. In a paper for the National Bureau of Economic Research, published in 2001, he and two colleagues observed that the housing boom intensified the savings collapse. Every time homeowners heard that a nearby house had sold for an astronomical price, they felt richer, even if they had no intention of selling for years. That made them more likely to go out and spend their theoretical "gains"—and not to bother saving, since their house was doing it for them. "The estimated effect of housing market wealth on consumption is significant and large," Shiller and his colleagues concluded. If people felt rich, they spent that way.

12. As background for the speechwriters, here is the longer version of what was happening.

In normal circumstances economic markets have a way of dealing with families, companies, or countries that chronically overspend. For families or companies that way is bankruptcy. For countries it is a declining currency. By normal economic measures the American public was significantly overspending in the early 2000s. For every $100 worth of products and services it consumed, it produced only about $95 worth within our borders. The other $5 worth came from overseas. Normally an imbalance like this would push the dollar steadily down as foreigners with surplus dollars from selling oil or cars or clothes in America traded them for euros, yuan, or yen. As demand for dollars fell and their value decreased, foreign goods would become more expensive; Americans wouldn't be able to afford as many of them; and ultimately Americans would be forced to live within the nation's means.

That is in fact what happened in America's trade with Europe—and to a large extent with the oil-producing world. The euro skyrocketed in value against the dollar, and oil prices—which until the crisis of 2009 were fixed in dollars—went up too, which preserved Saudi and Kuwaiti buying power for European goods.

It didn't work this way with China. Americans bought and bought Chinese goods, and Chinese banks piled up dollars—but didn't trade them back for yuan. Instead China's central bank kept the yuan-to-dollar exchange rate constant and used the dollars to buy U.S. Treasury notes. That is, they covered the federal budget deficit. (Since Americans, on average, were saving nothing, they couldn't cover it themselves.) To a lesser extent Korean and Japanese banks did the same thing.

This was different from the situation in the 1980s and 1990s, when foreigners earned dollars from their exports and used those dollars to buy American companies, real estate, and stock. In those days foreigners invested heavily in America because the payoff was so much greater than what they could get in Frankfurt or Tokyo. In an influential paper published in 2004 the economists Nouriel Roubini, of New York University, and Brad Setser, of Oxford University, demonstrated that this was no longer the case. Increasingly it was not individuals or corporations but foreign governments—in particular, state-controlled banks in Asia—that were sending money to America. And America was using it to finance the federal budget deficit.
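For the policy guys, the adjustment mechanism described above can be reduced to a toy loop. The $100-consumed, $95-produced split comes from the note; the response elasticity and the convergence threshold are invented for illustration:

```python
# Toy model of the normal currency correction: a country consuming 100
# while producing 95 sees its currency fall until spending is forced
# back toward output. The elasticity of 0.5 is a made-up illustration.
def rebalance(consumption, production, fx=1.0, elasticity=0.5):
    steps = 0
    while consumption - production > 0.01:
        gap_share = (consumption - production) / consumption
        fx *= 1 - elasticity * gap_share  # currency depreciates
        # Dearer imports shrink the overseas share of consumption.
        consumption = production + (consumption - production) * fx
        steps += 1
    return fx, consumption, steps

fx, consumption, steps = rebalance(100.0, 95.0)
```

The point of the sketch is the loop itself: as long as China's central bank held the yuan-dollar rate fixed, the `fx *= ...` line never executed in real life, and the gap never closed.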

13. The paper used this chart to show how foreign money was supporting U.S. spending.

14. We now know from the memoirs of his eldest son, Fidelito, that Castro never moderated his bitter view of the Kennedy brothers—Jack for authorizing the Bay of Pigs invasion, Bobby for encouraging the CIA to assassinate Castro—and, by extension, their Democratic Party. Castro told his children that if the United States and Cuba ever reconciled, he dreamed of doing two things: throwing an opening-day pitch at Yankee Stadium, and addressing a Republican convention in prime time. (From Mi Papa: The Castro I Knew, Las Vegas: HarperCollins, 2009.)

15. The first one, starting in 1973, transformed the world more than most wars do. It empowered OPEC; enriched much of the Middle East; brought on five years of inflation, slow growth, and stock-market stagnation in the United States; pushed Japan toward a radically more energy-efficient industry; and more. The second, after the Iranian revolution of 1979, caused the inflation that helped drive Jimmy Carter from office, and spilled over into the recession of Ronald Reagan's first two years. The third, after Iraq's 1990 invasion of Kuwait, disrupted world trade enough to lay the groundwork for Bill Clinton's "It's the economy, stupid" attack against George H.W. Bush. And seven years after the shock of 2009 began, we are still feeling its effects.

16. After the first oil shock U.S. oil consumption actually fell in absolute terms. In 1973, as the first shock began, Americans consumed 35 "quads," or quadrillion BTUs, of oil. Ten years later, with a larger population and a stronger economy, they consumed only 30. But from that point on total consumption moved back up. In 2003 Americans consumed 39 quads—and two thirds of that oil was for transportation. Consumption for most other purposes, notably heating and power generation, actually went down, thanks to more-efficient systems. Industrial consumption was flat. So bigger cars and longer commutes did make the difference.

17. Every oil field follows a pattern of production: Its output rate starts slow and keeps getting faster until about half the oil has been pumped from the field. Then the rate steadily declines until the other half of the oil is gone. Since total world production is the aggregate of thousands of fields, it is presumed to follow a similar pattern. In 2005 the research and engineering firm SAIC released a report commissioned by the U.S. government on best guesses about the worldwide peak and what would happen when it came. "No one knows with certainty when world oil production will reach a peak," the report said, "but geologists have no doubt that it will happen." Of the twelve experts surveyed for the report, six predicted that the peak would have occurred before 2010, and three more that it would happen by 2020.

The world was not going to "run out" of oil—at least not immediately. Even at the peak, by definition, as much as had ever been pumped in history was still there to be extracted. But the rate of production, barrels per day and per year, would steadily lessen while the rate of demand kept increasing. The report was released when oil crossed $50 a barrel; we are long into the era of oil at 30 euros, or $90.

18. That turned out to be the next-to-last convening of the Davos conference, before the unproven but damaging accusations that it was a front for the A. Q. Khan combine.

19. What happened to America almost exactly repeated what had happened ten years earlier to Thailand, Indonesia, and other countries during the Asian panic of 1997-1998. South Korea lost 50 percent of the value of its currency in two months; Indonesia lost 80 percent over the course of a year. As in America, the collapse of each currency led to equally deep stock-market declines. The Asian crash also turned into a foreign-policy nightmare for the United States, with Prime Minister Mahathir of Malaysia leading the denunciation of U.S.-based financiers, including the "moron" George Soros, for the "criminal" speculations that destroyed the economies of smaller nations like his. Since Malaysia and Indonesia are largely Muslim, and the financiers could be cast as part of the great shadowy U.S.-Zionist cabal, the crash worsened U.S. relations with the Islamic world.

20. Once the foreigners knew that the dollar had hit bottom, they came back to buy shares at bargain prices. But the currency run of 2009 showed the same pattern as the tech-stock crash of 2000 and, indeed, the generalized market panic of the 1930s: prices stayed depressed for years, because investors who had suffered heavy losses were understandably slow to return.

21. Let's make up flash cards for the speechwriters, so they are clear about the role of interest rates.

The most important thing that goes up when interest rates rise is the value of the dollar. We'll save the cause and effect for our policy guys, but make sure the writers have these points straight.

For the speechwriters' benefit, let's spell this out too: Why did the dollar panic raise interest rates? Two related reasons. First, interest rates are ultimately set by supply and demand. If the Treasury can't sell enough notes at four percent to cover the deficit, it will keep raising the rate—to five, six, ten percent—until it gets the money it needs. Second, the main way a government can keep up the value of its currency is to raise interest rates, hoping to attract investments that would otherwise be made in yuan, euros, or yen.
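If the writers want the first mechanism on a single flash card, here is a toy version of the auction logic. The demand curve and the dollar figures are invented for illustration, not taken from any actual Treasury auction:

```python
# Sketch of rate-setting by supply and demand: the Treasury raises the
# offered yield until investor demand covers the deficit it must finance.
def clearing_rate(deficit, demand_at, rate=0.02, step=0.005):
    while demand_at(rate) < deficit:
        rate += step  # sweeten the yield to attract more buyers
    return rate

# Hypothetical demand curve: each percentage point of yield draws $150B.
demand = lambda r: 150e9 * (r * 100)

rate = clearing_rate(600e9, demand)  # financing a $600B deficit
```

With these made-up numbers the rate climbs from two percent to four before the books balance; the note's second point is just this loop run for a different motive, propping up the currency rather than covering the deficit.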

22. In the spring of 2005, as stock averages slid week by week, W. Bowman Cutter, a managing partner of the investment-banking firm Warburg Pincus, asked, "Why are we not in a bull market now?" He said that if you looked at the traditional measures of economic strength—high corporate investment, rapid productivity improvements, strong overall growth rates—"you would have to say that 2004 was the best year of the past twenty." Interest rates at the time were still very low. "If you transposed this to any other era in history," Cutter said, "you would have a very strong bull market. Why not now? Because the market is looking to the long-term structural problems." If the market couldn't go up when conditions were promising, it had no cushion when the crisis began.

23. Jobs in the airline industry had been plummeting for years. In 2000 the eight largest carriers employed 432,000 people. Four years later a third of those jobs were gone. That meant the loss of 136,000 mainly unionized, mainly high-wage jobs, offset by a small increase in lower-paid jobs at regional and discount airlines.

24. U.S. auto companies and the U.S. auto-buying public suffered in different ways from the "slowness" of America's industry compared with Japan's, China's, and Korea's. It took Detroit companies three years to shift production from trucks and SUVs to hybrid cars; by that time the Asian brands owned the market. Also, it took the American fleet as a whole a surprisingly long time to change. The average car on America's roads is nine years old, and in the course of a decade only half of all cars are replaced. It takes a long time to work the older gas-guzzlers out of the system.

25. The rising value of the euro and the troubled state of the airline market might well have made Boeing a similar target for the new Airbus-Mitsubishi consortium—but for the Transformational Air Mobility Industrial Base Act of 2011, which converted Boeing's factories to national-defense production facilities on a par with Navy shipyards.

26. Through the boom years speculators would borrow the entire cost of a house. If they could "flip" it in a year or two, the profit on the sale would offset the interest they'd paid. But after mortgage rates "floated" up above 10 percent, the calculation changed. The house's value was heading down, and the cost of covering the mortgage was heading up. If the house were just another asset, the rational choice would be to move out and give it back to the bank. But houses aren't normal assets, and that's not what people did.
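The flippers' arithmetic in the note fits in one back-of-envelope function. All numbers are hypothetical, and the loan is treated as interest-only for simplicity:

```python
# Borrow the entire price, hope appreciation outruns the interest paid.
def flip_profit(price, yearly_appreciation, mortgage_rate, years=2):
    sale = price * (1 + yearly_appreciation) ** years
    interest = price * mortgage_rate * years  # interest-only loan
    return sale - price - interest

boom = flip_profit(300_000, 0.10, 0.06)    # rising market, cheap money
bust = flip_profit(300_000, -0.05, 0.105)  # falling prices, 10.5% rates
```

With the invented boom numbers the flip clears a profit; once prices head down and the floating rate passes 10 percent, the same house loses money on both lines of the ledger, which is why the rational move was to hand the keys back to the bank.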

27. The pattern goes back to the very beginning of the modern primary system, after World War II, and it has no exceptions. If an incumbent faces a serious, vote-getting rival for his party's nomination, he goes on to lose the White House. If not, he stays in.

28. State and local governments tax income, which was falling; property, whose value was plummeting; and retail sales, which were down as well. The blue states were somewhat cushioned against the shocks in comparison with the many red states that had declined to impose state income taxes. Those states depended on property taxes, a fast-disappearing revenue source. Also, since the Nixon years red and blue states alike had relied on federal revenue sharing. This was slashed as part of the Emergency Budget Act of 2012.

29. In 2002 the Rockefeller Institute of Government projected budget trends for the states through 2010, and found that forty-four of them were headed for long-term deficits like the ones plaguing the federal government. The difference, again, is that many states were obliged to change their policies to avoid the deficits.

30. This accelerated a trend that had begun a decade earlier in California. For instance, when the 2003 school year began, some 175,000 students could not find space in community colleges—which, like K-12 public schools, had previously offered enrollment to all eligible students.

31. Gwinnett County, near Atlanta, opened many school administrators' eyes to this possibility in 2004, when it brought in twenty-seven teachers from Hyderabad. In 2005 an examination board in England outsourced the grading of high-school achievement exams to workers in India.

32. For "high-altitude, low-opening" parachute jump. The jumpers leave the plane at 30,000 feet, free-fall for nearly two minutes, and open their chutes at 1,000 feet, a few seconds before landing. Because the airplanes are so high, they cannot be seen or heard from the ground; and the jumpers spend almost no time with their chutes visibly deployed.

33. In the spring of 2005 the Congressional Joint Committee on Taxation estimated that ending the estate tax would directly cut federal revenue by $72 billion in 2015. Other groups calculated that the total impact on the budget, including higher interest payments on a larger federal debt, would be $100 billion a year, or $1 trillion over a decade. All this tax relief flowed to the wealthiest one percent of Americans.

34. In 1990 the American aerospace industry employed 1,120,000 people. By 2004 that number had fallen by nearly half, to 593,000. During those same years the European aerospace industry was growing in both sales and work force. In 2003 Airbus overtook Boeing in world market share for commercial airliners.

35. In 2005 the American Society of Civil Engineers released a "report card" on the state of America's infrastructure—roads, dams, bridges, aviation, and so on. The overall grade was D, with the highest mark being C+, for solid-waste handling. According to the report, the most dramatic underinvestment involved the nation's roads. Simply maintaining the roads at the same level would cost $94 billion, the report said—or half again as much as actual yearly investment levels. Improving the roads would require about twice as much as the United States was spending.

36. In 2003 the city of Shanghai opened the world's fastest maglev line, whose trains average 267 miles per hour and arrive on schedule 99.7 percent of the time. An editor's note in the Journal of the American Society of Civil Engineers pointed out that half a dozen maglev proposals for American cities were "stalled in one stage or another of planning, permitting, or budgeting." The result, the journal's editor observed, was this: "Traffic congestion on U.S. roads worsens, energy prices fluctuate unpredictably, and, at least for the moment, China pulls ahead of the United States on the path to a safe, reliable, fast, and efficient means of transporting passengers."

37. Foreign enrollment in U.S. universities increased steadily from 1971 through 2002. It fell the next year, and has gone down ever since.

39. It's hard to remember or even to believe, but not that long ago the school system was a valuable social equalizer. More important, it was seen that way. Through the three golden decades, from the late 1940s (when the GI Bill kicked in) to the late 1970s (when Proposition 13 passed in California), the federal government and the states put more money than ever before into elementary schools, high schools, and universities. More students than ever before finished high school; more finished college; more felt they could go further than their parents had. Proposition 13 was the California ballot measure that cut property taxes by 30 percent and then capped their future growth. It prefigured the federal tax cuts of the early 2000s, because it pushed the level of revenue below its historic "band." Before Proposition 13 California's per capita spending on public schools was high, like Connecticut's or New York's. Twenty years later it was well below the national average, just ahead of Arkansas's.

40. In the early 2000s one third of American public high school students failed to graduate on time. Niels Christian Nielsen, a member of several corporate boards in Europe and the United States, said at the University of California in 2005, "The big difference between Europe and America is the proportion of people who come out of the system really not being functional for any serious role. In Finland that is maybe two or three percent. For Europe in general maybe fifteen or twenty. For the United States at least thirty percent, maybe more. In spite of all the press, Americans don't really get the education difference. They generally still feel this is a well-educated country and work force. They just don't see how far the country is falling behind."

About the Author

James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.
