How will we all keep busy when we only have to work 15 hours a week? That was the question that worried the economist John Maynard Keynes when he wrote his short essay “Economic Possibilities for Our Grandchildren” in 1930. Over the next century, he predicted, the economy would become so productive that people would barely need to work at all.

For a while, it looked like Keynes was right: In 1930 the average workweek was 47 hours. By 1970 it had fallen to slightly less than 39.

But then something changed. Instead of continuing to decline, the duration of the workweek stayed put; it’s hovered just below 40 hours for nearly five decades.

So what happened? Why are people working just as much today as in 1970?

There would be no mystery in this if Keynes had been wrong about the economy’s increasing productivity, which he thought would lead to a standard of living “between four and eight times as high as it is today.” But Keynes got that right: Technology has made the economy massively more productive. According to Benjamin M. Friedman, an economist at Harvard, “the U.S. economy is right on track to reach Keynes’s eight-fold multiple” by 2029—100 years after the last data Keynes would have had. (Keynes did not specify what he meant by a “standard of life,” so Friedman uses per-capita output as a proxy.)
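
As a rough check (the arithmetic here is illustrative, not Friedman’s), Keynes’s four- to eight-fold range pins down the compound annual growth rate it implies over his roughly 100-year horizon:

```python
# Compound annual growth rate implied by an n-fold rise over 100 years:
# (1 + g) ** 100 == n  =>  g = n ** (1 / 100) - 1
for multiple in (4, 8):
    g = multiple ** (1 / 100) - 1
    print(f"{multiple}-fold over a century implies {g:.1%} growth per year")
# 4-fold -> about 1.4% per year; 8-fold -> about 2.1% per year
```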

In a new paper, Friedman tries to figure out why that increased productivity has not translated into increased leisure time. One explanation he considers is that people never feel materially satisfied, always wanting more money for the next new thing. “This argument is, at best, far from sufficient,” he writes. If that were the case, why did the duration of the workweek decline in the first place?

Another theory Friedman considers is that “in an era of ever fewer settings that provide effective opportunities for personal connections and relationships,” people may place more value on the socializing that happens at work. But the evidence for this “remains uneven at best,” and, once again, “its bearing on the abrupt change in trend in the U.S. workweek in the 1970s is far from established.”

A third possibility proves more convincing: American inequality means that the gains of increasing productivity are not widely shared. In other words, most Americans are too poor to work less. Unlike the other two explanations Friedman considers, this one fits chronologically: Inequality declined in America during the post-war period (along with the duration of the workweek), but since the early 1970s it’s risen dramatically.

Keynes’s prediction rests on the idea that “standard of life” would continue rising for everyone. But Friedman says that’s not what has happened: Although Keynes’s eight-fold figure holds up for the economy in aggregate, it’s not at all the case for the median American worker. For that worker, output by 2029 is likely to be around 3.5 times what it was when Keynes was writing—a bit below his four- to eight-fold predicted range.
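
The same back-of-the-envelope formula makes the gap concrete: a 3.5-fold rise over the same century implies annual growth below even the bottom of Keynes’s range (again an illustration, not a figure from the paper):

```python
# Median worker: roughly 3.5-fold over the same 100-year horizon
g = 3.5 ** (1 / 100) - 1
print(f"3.5-fold over a century implies {g:.1%} growth per year")  # about 1.3%
# That sits below the ~1.4% per year implied by Keynes's four-fold lower bound.
```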

This can be seen in the median worker’s income over this period, which shows a turning point in 1973 that coincides precisely with when the workweek stopped shrinking. According to Friedman, “Between 1947 and 1973 the average hourly wage for nonsupervisory workers in private industries other than agriculture (restated in 2013 dollars) nearly doubled, from $12.27 to $21.23—an average growth rate of 2.1 percent per annum. But by 2013 the average hourly wage was only $20.13—a 5 percent fall from the 1973 level.” For most people, then, the magic of increasing productivity stopped working around 1973, and they had to keep working just as much to maintain their standard of living.
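
Friedman’s quoted rates can be verified from the dollar figures alone; a minimal sketch, using only the numbers in the passage above:

```python
# Friedman's figures (2013 dollars): $12.27 in 1947 -> $21.23 in 1973 -> $20.13 in 2013
years = 1973 - 1947
g = (21.23 / 12.27) ** (1 / years) - 1
print(f"1947-1973: {g:.1%} per year")                   # about 2.1%, as quoted
print(f"1973-2013: {20.13 / 21.23 - 1:+.1%} in total")  # about -5%, as quoted
```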

What Keynes foretold was a very optimistic version of what economists call technological unemployment—the idea that less labor will be necessary because machines can do so much. In Keynes’s vision, the resulting unemployment would be distributed more or less evenly across society in the form of increased leisure.

Friedman says that reality comports more with a darker version of technological unemployment: It’s not unemployment per se, but a soft labor market in which millions of people are “desperately seeking whatever low-wage work [they] can get.” This is corroborated by a recent Marketplace poll, which found that, for half of hourly workers, the top concern isn’t working too much but working too little—not, presumably, because they like their jobs so much, but because they need the money.

This explanation leaves an important question: If the very rich—the workers who have reaped above-average gains from the increased productivity since Keynes’s time—can afford to work less, why don’t they? I asked Friedman about this, and he theorized that for many top earners, work is a labor of love. They are doing work they care about and find interesting, and doing more of it isn’t such a burden—it may even be a pleasure. They derive meaning from their jobs, which are an important part of how they think of themselves. And, of course, they are compensated at a level that makes it worth their while.

The prosperity Keynes predicted is here. After all, the economy as a whole has grown even more brilliantly than he expected. But for most Americans, that prosperity is nowhere to be seen—and, as a result, neither are those shorter workweeks.
