Francis Fukuyama ("The Great Disruption," May Atlantic) remarks that "in fact the great American postwar crime wave began in a period of full employment and general prosperity." Precisely. Prosperity results in modernization, and modernization has various consequences. One of those consequences was the rise of a new urban underclass.

Our urban underclass consists of displaced southern peasants and their descendants. In 1945 much of southern agriculture, notably cotton, was still dependent on hand labor, mediated by the sharecropper-tenancy system. In Alvin Toffler's terms, these workers were still in the first wave, rather than the second. Depression and war had forced southern agriculture to defer modernization for twenty years. In the postwar period southern landowners had the money for new equipment, especially mechanical cotton harvesters. Southern landlords evicted their tenants, destroyed the tenants' houses, and bought the new machines.

The refugees of this displacement process arrived in northern cities only to find that northern industrialists had also been buying labor-saving machinery and that the existing industrial labor force had first claim on whatever jobs were available. Eventually some industrial jobs opened up, when the northern working class no longer wanted them (having sent their children to school to become policemen, teachers, and so forth), but that took time. In the meantime, many of the displaced peasants turned to crime and violence.

Many European countries (and Japan) have taken the stance that allowing people to evict peasants is simply not profitable from a societal point of view. Peasants can adapt to the city only if they are allowed to arrive at their own pace, with the option of returning to the farm if urban life does not agree with them. Even when European peasants did not own land, they were protected by appropriate legislation. Of course, often they did own land, because as early as the sixteenth century central governments began enforcing rent control, eventually expropriating the landlord in favor of the peasant.

Andrew D. Todd

In "The Great Disruption," Francis Fukuyama writes, "A dynamic, technologically innovative economy will by its very nature disrupt existing social relations." I find this statement difficult to reconcile with his claim that "social order, once disrupted, tends to get remade," and even more so with his optimistic assertion that such a process of social regeneration may have already begun.

The technological explosion to which Fukuyama attributes the rampant individualism at the heart of our social disorder is the product of neo-liberal democracy and market capitalism, both imbued by Fukuyama with an aura of Hegelian inevitability, and both totally committed to the excessive individualism he deplores. Clearly, it is not in their nature to encourage the "trade-off between personal freedom and community" that he feels is so essential to rehumanizing society, so one is at a loss to identify the source of his optimism.

Fukuyama fails to consider that the collective psyche of the late twentieth century has been so deeply imprinted with the ideology of individualism that even those who are conscious of the need for trade-offs between personal freedom and community tend to back away when faced with the price to be paid for their professed ideals. This is the real victory of late capitalism over the hearts and minds of its adherents.

Until Fukuyama addresses this collective psychological fact, his belief that the Great Disruption may have run its course will remain little more than wishful thinking.

Howard Bluth

Sometimes one article is worth the year's subscription price. For me it is "The Great Disruption."

Richard E. Appel

Andrew Todd points to the migration of rural blacks to northern cities as the source of what I labeled the "Great Disruption" -- that is, the descent into social anarchy and family breakdown that emerged in many inner cities by the 1980s. But poor blacks were being drawn north, as Nicholas Lemann showed in his book The Promised Land, by plentiful low-skill jobs in the 1940s and 1950s; intense social pathology didn't emerge for another fifteen to twenty years. I believe that it is less the migration itself than the disappearance of these jobs in the 1970s that is at the root of the problem, exacerbated by a welfare system that rewarded single-parent families.

Howard Bluth seems to think that intensive individualism is the product of a certain type of late-twentieth-century "neo-liberal" capitalism. In fact it is deeply embedded in the whole Western Enlightenment, which has been ongoing for at least the past three centuries, and is part and parcel of the individual freedoms we all enjoy. As I explain at greater length in the book on which my Atlantic article was based, capitalism both destroys and creates social capital, and on balance is, as Adam Smith argued, a moralizing force in commercial societies. But the ultimate reason for my optimism is that we human beings by our very natures don't like individualism carried to extremes, and tend to moderate it by creating new social rules that bind us in communities. Whether I'm right this time around, only time will tell.

I am, of course, very grateful that Richard Appel feels he got his money's worth in subscribing to The Atlantic.

I was very disappointed in Francis Davis's "Napoleon in Rags" (May Atlantic). I am simply not interested in hearing why the music someone loves (often the music with which he or she came of age) is more profound or meaningful than someone else's music.

Davis's article presents opinions as facts and is both unattractive and misguided. An example: Davis says "As songwriters, John Lennon and Paul McCartney were surprisingly traditional; they brought a youthful cheekiness to pop, but in terms of lyrical sentiment and melodic structure their songs of the sixties merely updated Tin Pan Alley conventions." Very few composers or musicologists would agree with such an irresponsible and inaccurate statement, not least because the Beatles were extremely original in their chord progressions.

Alexander Wood

Francis Davis refers in passing to "Positively 4th Street" as "a song from Highway 61 Revisited," the 1965 album often praised as Dylan's masterwork. It isn't; the song does not appear on that album.

Glenn Hughes

A couple of factual notes: The song "Positively 4th Street" is not on Highway 61 Revisited. Also, the quote from "Like a Rolling Stone" isn't quite correct -- it's "Napoleon in rags and the language that he used."

Stephen Wacker

Perhaps believing "Tin Pan Alley" to be a pejorative, Alexander Wood misunderstands what I said about the Beatles, whose music I preferred to Dylan's when I was a teenager and still do. Unlike Dylan's quasi-folk ballads and blues, Lennon and McCartney's early numbers were in the same 32-bar format as Jerome Kern's and Irving Berlin's, but with an updated beat. Their chord progressions were sometimes "original," if by this Wood means inventive, though not necessarily more inventive than Kern's or Berlin's.

Glenn Hughes and Stephen Wacker are right about "Positively 4th Street," which was recorded at one of the sessions for Highway 61 Revisited but did not appear on that LP. Dylan often changes the lyrics to his songs in performance; to my ears, it sounds like he sings "the language that he'd use" on both the 1965 hit version of "Like a Rolling Stone" and the version from Manchester, England, on Live 1966.

"There was in fact a great deal at stake for Britain in the Great War," Benjamin Schwarz writes ("Was the Great War Necessary?," May Atlantic.) "Even assuming a benevolent German order on the Continent, the result of Germany's victory would have been that British independence as a great power would have been greatly diminished." But Britain ostensibly "won" the war, and its independence as a great power was greatly diminished in any case.

Britain's efforts from September of 1939 to 1942 to contend with the German and the Japanese armed forces were laughable -- whether we are talking about France or Singapore. Britain was saved from German conquest in the Second World War by the Americans and the Soviet Union. The fact of the matter is this: Britain in August of 1914 had already entered a steep descent toward being a second-rate power, whether economically or militarily. But the British government and the majority of Britain's ruling class could not face the reality of this decline, because then they would have had to reflect on the reasons for this descent: principally an unwillingness to absorb and utilize modern technology, an educational system backward at all levels, and the physical weakness and illiteracy of Britain's working class. War seemed a good alternative to facing these realities. Glory could cover up reason.

Nothing could have prevented Britain's decline in the twentieth century relative to Germany, the United States, and Russia. But the country's social and political leaders would not face this cruel fact, which indicated plainly that nothing was to be gained from war with Germany in 1914, and much was to be lost. Niall Ferguson is right.

Norman F. Cantor

In Benjamin Schwarz's review of Niall Ferguson's The Pity of War one looks in vain for a discussion of the book's major deficiency: it treats the reasons the British had for entering the First World War as though they were the only thing worth examining when one weighs the casualties incurred, and it never considers the incompetence of Britain's generals or the political choices made in selecting those generals and keeping them in command.

It is as though a seriously ill person were hurt in an auto accident while being driven to the hospital by a reckless driver. One cannot examine only whether one would have been better off staying at home; one should consider whether one could have chosen a better driver.

The reckless drivers for Britain in the First World War were its generals. Hundreds of thousands of men were senselessly slaughtered in obedience to a reckless macho doctrine called "élan vital." Few of the commanding generals acted on the self-evident fact that charging bravely back and forth across strategically worthless land will not win a war against shells, machine guns, and poison gas.

Stephen E. Adler

I agree with Norman Cantor's assertions about pre-war Britain's weaknesses. But his letter mixes apples and oranges. The historians who have most carefully diagnosed those weaknesses (Correlli Barnett, whom I discuss in my article, and Paul Kennedy) disagree with him and agree with me about what was at stake for Britain in the Great War. Certainly Britain's relative decline was inevitable; that does not mean that it would have been wise of British statesmen to stand aside and accept the dominance of Europe by a single power. Statesmanship, after all, is about managing "inevitable" changes in the international system. Britain found itself in such a terrible predicament during the Second World War because its interwar leaders failed to manage their country's relative decline, not because its leaders in 1914 wouldn't place their country's security and prosperity at the sufferance of Germany. Of course, Britain's independence has been greatly diminished since the Second World War; but thanks to British statesmen on the eve of the First World War, London has been under the sway of a (relatively) benign hegemon across the Atlantic Ocean rather than an unpredictable and strident one across the English Channel. Finally, although Professor Cantor is a fine medieval historian, his suggestion that Britain's leaders conspiratorially committed their country to war to "cover up" internal weaknesses is unsupported by any historical evidence.

If Stephen Adler's assessment of the peculiar incompetence of British generals were correct, how would we explain the fact that Britain suffered far fewer casualties than Germany, France, Russia, and Austria-Hungary? From 1914 to 1918 the British army learned at hideous cost a new kind of warfare that baffled even experienced Continental armies. Incompetent generals there certainly were, as there are in every war. But the terrible truth -- the tragedy -- of the Great War was that the wealth that was available to the great powers for war-making, together with the state of military technology, meant that no masterstroke of strategy would swiftly or at low cost terminate the struggle.

In the May issue of The Atlantic, William Aron, William Burke, and Milton Freeman argue that whales can be killed under hunting restrictions for years to come ("Flouting the Convention"). Yes, they can, but should they be killed at all? On purely sentimental grounds I stand with the whales in the debate. I draw support from the voice of the Irish storyteller Sean O'Faolain, who reminds us that "we are for a great part of our lives at the mercy of uncharted currents of the heart."

Whales are awesome. A sperm whale dives a mile deep in the sea and holds its breath for more than an hour. The blue whale (the largest mammal) outweighs the pygmy shrew (the smallest) by a factor of ninety million, yet the two have similar tissues and organs, and both, I presume, nurse their young with a certain tenderness. Whales live in families and play in the moonlight; they talk to one another in distress. They are more complete and successful in their world than we are in ours. They deserve to be known and cherished, not for their potential as meatballs but as a collective inspiration for humankind. The thought of managing them for their spiritual value alone seems far more civilized than the thought of managing them to satisfy a tiny fraction of the world's insatiable demand for marine products of commerce.

Victor B. Scheffer

We cannot accept the view that caring about human beings is antagonistic to being concerned for plants and animals. It is our responsibility to see that our ecosystem is preserved. Implicit in this is a recognition of the importance of all life forms, including those used for food as well as those enjoyed for their beauty or their role in preserving biological balance. A healthy and inspiring world, however, requires more than biological diversity; it is also strongly dependent on cultural diversity.

We plead for understanding of cultures different from our own. We share Victor Scheffer's awe of whales. They are magnificent. He states plainly that his views are sentimental; he is a distinguished marine scientist, but his cultural perspective on whales, too, merits respect. Such sentiments, in our view, do not outweigh the needs of other cultures and communities whose ongoing dependence on whaling can be demonstrated. As long as they conduct their food-getting activities in a nonwasteful, ecologically sound, and lawful manner, we cannot believe there is any legitimate basis for thinking that the world is too small to accommodate such cultural differences.

The fact that people may choose to eat a particular animal does not imply disrespect or even indifference. In fact, many hunters speak of a special closeness to the animals they hunt -- something that many nonhunters find hard to understand.

As someone who makes a living growing and selling apples, I read with some concern your June Almanac item on food [regarding the level of pesticide residues in certain fruits]. A typical consumer might conclude that eating U.S. produce poses a serious health risk. Far from it. Painting a negative picture with a broad brush and with inflammatory statements such as "potentially unhealthy to children" and "domestic produce in general is more contaminated than imported," the author uses tabloid tactics to alarm, rather than inform, the public.
