For years, the residents of Oxford, Massachusetts, seethed with anger at the company that controlled the local water supply. The company, locals complained, charged inflated prices and provided terrible service. But unless the town’s residents wanted to get by without running water, they had to pay up, again and again.

The people of Oxford resolved to buy the company out. At a town meeting in the local high-school auditorium, an overwhelming majority of residents voted to raise the millions of dollars that would be required for the purchase. It took years, but in May 2014, the deal was nearly done: One last vote stood between the small town and its long-awaited goal.

The company, however, was not going down without a fight. It mounted a campaign against the buyout. On the day of the crucial vote, the high-school auditorium swelled to capacity. Locals who had toiled on the issue for years noticed many newcomers—residents who hadn't shown up to previous town meetings about the buyout. When the vote was called, the measure failed—the company, called Aquarion, would remain the town's water supplier. Supporters of the buyout mounted a last-ditch effort to take a second vote, but before it could be organized, a lobbyist for Aquarion pulled a fire alarm. The building had to be evacuated, and the meeting adjourned. Aquarion retains control of Oxford's water system to this day.

The company denied that the lobbyist was acting on its behalf when he pulled the alarm; it also denied that its rates were abnormally high or that it provided poor service. Some Oxford residents supported Aquarion, and others opposed the buyout because they feared the cost and complication of the town running its own water company. But many residents, liberal and conservative, were frustrated by the process. The vote, they felt, hadn't taken place on a level playing field.

“It was a violation of the sanctity of our local government by big money,” Jen Caissie, a former chairman of the board of selectmen in Oxford, told me. “Their messiah is their bottom line, not the health of the local community. And I say that as a Republican, someone who is in favor of local business.”

A New England town meeting would seem to be one of the oldest and purest expressions of the American style of government. Yet even in this bastion of deliberation and direct democracy, a nasty suspicion had taken hold: that the levers of power are not controlled by the people.


It’s a suspicion stoked by the fact that, across a range of issues, public policy does not reflect the preferences of the majority of Americans. If it did, the country would look radically different: Marijuana would be legal and campaign contributions more tightly regulated; paid parental leave would be the law of the land and public colleges free; the minimum wage would be higher and gun control much stricter; abortions would be more accessible in the early stages of pregnancy and illegal in the third trimester.

The subversion of the people’s preferences in our supposedly democratic system was explored in a 2014 study by the political scientists Martin Gilens of Princeton and Benjamin I. Page of Northwestern. Four broad theories have long sought to answer a fundamental question about our government: Who rules? One theory, the one we teach our children in civics classes, holds that the views of average people are decisive. Another theory suggests that mass-based interest groups such as the AARP have the power. A third theory predicts that business groups such as the Independent Insurance Agents and Brokers of America and the National Beer Wholesalers Association carry the day. A fourth theory holds that policy reflects the views of the economic elite.

Gilens and Page tested those theories by tracking how well the preferences of various groups predicted the way that Congress and the executive branch would act on 1,779 policy issues over a span of two decades. The results were shocking. Economic elites and narrow interest groups were very influential: They succeeded in getting their favored policies adopted about half of the time, and in stopping legislation to which they were opposed nearly all of the time. Mass-based interest groups, meanwhile, had little effect on public policy. As for the views of ordinary citizens, they had virtually no independent effect at all. “When the preferences of economic elites and the stands of organized interest groups are controlled for, the preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy,” Gilens and Page wrote.

Outlets from The Washington Post to Breitbart News cited this explosive finding as evidence of what overeager headline writers called American oligarchy. Subsequent studies critiqued some of the authors’ assumptions and questioned whether the political system is quite as insulated from the views of ordinary people as Gilens and Page found. The most breathless claims made on the basis of their study were clearly exaggerations. Yet their work is another serious indication of a creeping democratic deficit in the land of liberty.

From Our March 2018 Issue


To some degree, of course, the unresponsiveness of America’s political system is by design. The United States was founded as a republic, not a democracy. As Alexander Hamilton and James Madison made clear in the Federalist Papers, the essence of this republic would consist—their emphasis—“IN THE TOTAL EXCLUSION OF THE PEOPLE, IN THEIR COLLECTIVE CAPACITY, from any share” in the government. Instead, popular views would be translated into public policy through the election of representatives “whose wisdom may,” in Madison’s words, “best discern the true interest of their country.” That this radically curtailed the degree to which the people could directly influence the government was no accident.

Only over the course of the 19th century did a set of entrepreneurial thinkers begin to dress an ideologically self-conscious republic up in the unaccustomed robes of a democracy. Throughout America, the old social hierarchies were being upended by rapid industrialization, mass immigration, westward expansion, and civil war. Egalitarian sentiment was rising. The idea that the people should rule came to seem appealing and even natural. The same institutions that had once been designed to exclude the people from government were now commended for facilitating government “of the people, by the people, for the people.”

The shifting justification for our political system inspired important reforms. In 1913, the Seventeenth Amendment stipulated that senators had to be elected directly by the people, not by state legislatures. In 1920, the Nineteenth Amendment gave women the vote. In 1965, the Voting Rights Act, drawing on the Fifteenth Amendment, set out to protect the vote of black Americans. The once-peculiar claim that the United States was a democracy slowly came to have some basis in reality.

That basis is now crumbling, and the people have taken notice. In no small part that’s because the long era during which average Americans grew more wealthy has come to a sputtering stop. People who are asked how well they are doing economically frequently compare their own standard of living with that of their parents. Until recently, this comparison was heartening. At the age of 30, more than nine in 10 Americans born in 1940 were earning more than their parents had at the same stage of their lives. But according to eye-popping research led by the economist Raj Chetty and his co-authors, many Millennials do not share in this age-old American experience of improving fortunes. Among those Americans born in the early 1980s, only half earn more than their parents did at a similar age.

Americans have never loved their politicians or thought of Washington as a repository of moral virtue. But so long as the system worked for them—so long as they were wealthier than their parents had been and could expect that their kids would be better off than them—people trusted that politicians were ultimately on their side. Not anymore.

The rise of digital media, meanwhile, has given ordinary Americans, especially younger ones, an instinctive feel for direct democracy. Whether they’re stuffing the electronic ballot boxes of The Voice and Dancing With the Stars, liking a post on Facebook, or up-voting a comment on Reddit, they are seeing what it looks like when their vote makes an immediate difference. Compared with these digital plebiscites, the work of the United States government seems sluggish, outmoded, and shockingly unresponsive.

As a result, average voters feel more alienated from traditional political institutions than perhaps ever before. When they look at decisions made by politicians, they don’t see their preferences reflected in them. For good reason, they are growing as disenchanted with democracy as the people of Oxford, Massachusetts, did.

The politician who best intuited this discontent—and most loudly promised to remedy it—is Donald Trump. The claim that he would channel the voice of the people to combat a corrupt and unresponsive elite was at the very core of his candidacy. “I am your voice,” Trump promised as he accepted his party’s nomination at the Republican National Convention. “Today, we are not merely transferring power from one administration to another or from one party to another,” he proclaimed in his inaugural address, “but we are transferring power from Washington, D.C., and giving it back to you, the people.”

Donald Trump won the presidency for many reasons, including racial animus, concerns over immigration, and a widening divide between urban and rural areas. But public-opinion data suggest that a deep feeling of powerlessness among voters was also important. I analyzed 2016 data from the American National Election Studies. Those who voted for Trump in the Republican primaries, more than those who supported his competition, said that they “don’t have any say about what the government does,” that “public officials don’t care much what people like me think,” and that “most politicians care only about the interests of the rich and powerful.”

Trump has no real intention of devolving power back to the people. He’s filled his administration with members of the same elite he disparaged on the campaign trail. His biggest legislative success, the tax bill, has handed gifts to corporations and the donor class. A little more than a year after America rebelled against political elites by electing a self-proclaimed champion of the people, its government is more deeply in the pockets of lobbyists and billionaires than ever before.

It would be easy to draw the wrong lesson from this: If the American electorate can be duped by a figure like Trump, it can’t be trusted with whatever power it does retain. To avoid further damage to the rule of law and the rights of the most-vulnerable Americans, traditional elites should appropriate even more power for themselves. But that response plays into the populist narrative: The political class dislikes Trump because he threatens to take its power away. It also refuses to recognize that the people have a point.

America does have a democracy problem. If we want to address the root causes of populism, we need to start by taking an honest accounting of the ways in which power has slipped out of the people’s hands, and think more honestly about the ways in which we can—and cannot—put the people back in control.


At the height of the Mexican–American War, Nicholas Trist traveled to Mexico and negotiated the Treaty of Guadalupe Hidalgo, which ended the hostilities between the two nations and helped delineate America’s southern border. Two decades later, the U.S. government still hadn’t paid him for his services. Too old and weak to travel to Washington to collect the money himself, Trist hired a prominent lawyer by the name of Linus Child to act on his behalf, promising him 25 percent of his recovered earnings.

Congress finally appropriated the money to settle its debt. But now it was Trist who refused to pay up, even after his lawyer sued for his share. Though the contract between Trist and Child hardly seems untoward by today’s standards, the Supreme Court refused to uphold it out of fear that it might provide a legal basis for the activities of lobbyists:

If any of the great corporations of the country were to hire adventurers who make market of themselves in this way, to procure the passage of a general law with a view to the promotion of their private interests, the moral sense of every right-minded man would instinctively denounce the employer and employed as steeped in corruption.

Extreme as this case may appear, it was far from idiosyncratic. In her book Corruption in America, the legal scholar Zephyr Teachout notes that the institutions of the United States were explicitly designed to counter the myriad ways in which people might seek to sway political decisions for their own personal gain. Many forms of lobbying were banned throughout the 19th century. In Georgia, the state constitution at one time read that “lobbying is declared to be a crime.” In California, it was a felony.

Over the course of the 20th century, lobbying gradually lost the stench of the illicit. But even once the activity became normalized, businesses remained reluctant to exert their influence. As late as the 1960s, major corporations did not lobby directly on their own behalf. Instead, they relied on collectives such as the U.S. Chamber of Commerce, which had a weaker voice in Washington than labor unions or public-interest groups. “As every business executive knows,” the future Supreme Court Justice Lewis F. Powell Jr. complained in 1971, “few elements of American society today have as little influence in government as the American businessman.”


All of this began to change in the early 1970s. Determined to fight rising wages and stricter labor and environmental standards, which would bring higher costs, CEOs of companies like General Electric and General Motors banded together to expand their power on Capitol Hill. At first, their activities were mostly defensive: The goal was to stop legislation that might harm their interests. But as the political influence of big corporations grew, and their profits soared, a new class of professional lobbyists managed to convince the nation’s CEOs that, in the words of Lee Drutman, the author of the 2015 book The Business of America Is Lobbying, their activity “was not just about keeping the government far away—it could also be about drawing government close.”

Today, corporations wield immense power in Washington: “For every dollar spent on lobbying by labor unions and public-interest groups,” Drutman shows, “large corporations and their associations now spend $34. Of the 100 organizations that spend the most on lobbying, 95 consistently represent business.” (Read about a principal architect of the lobbying industry—Paul Manafort—in our March 2018 cover story.)

The work of K Street lobbyists, and the violation of our government by big money, has fundamentally transformed the work—and the lives—of the people’s supposed representatives. Steve Israel, a Democratic congressman from Long Island, was a consummate moneyman. Over the course of his 16 years on Capitol Hill, he arranged 1,600 fund-raisers for himself, averaging one every four days. Israel cited fund-raising as one of the main reasons he decided to retire from Congress, in 2016: “I don’t think I can spend another day in another call room making another call begging for money,” he told The New York Times. “I always knew the system was dysfunctional. Now it is beyond broken.”

A model schedule for freshman members of Congress prepared a few years ago by the Democratic Congressional Campaign Committee instructs them to spend about four hours every day cold-calling donors for cash. The party encourages so many phone calls because the phone calls work. Total spending on American elections has grown to unprecedented levels. From 2000 to 2012, reported federal campaign spending doubled. It’s no surprise, then, that a majority of Americans now believe Congress to be corrupt, according to a 2015 Gallup poll. As Israel memorably put it to HBO’s John Oliver, the hours he had spent raising money had been “a form of torture—and the real victims of this torture have become the American people, because they believe that they don’t have a voice in this system.”

Big donors and large corporations use their largesse to sway political decisions. But their influence goes far beyond those instances in which legislators knowingly sacrifice their constituents’ interests to stay on the right side of their financial backers. The people we spend time with day in and day out shape our tastes, our assumptions, and our values. The imperative to raise so much money means that members of Congress log more time with donors and lobbyists and less time with their constituents. Often, when faced with a vote on a bill of concern to their well-heeled backers, legislators don’t have to compromise their ideals—because they spend so much of their lives around donors and lobbyists, they have long ago come to share their views.

The problem goes even deeper than that. In America's imagined past, members of Congress had a strong sense of place. Democrats might have risen through the ranks of local trade unions or schoolhouses. Republicans might have been local business or community leaders. Members of both parties lived lives intertwined with those of their constituents. But spend some time reading the biographies of your representatives in Congress, and you'll notice, as I did, that by the time they reach office, many politicians have already been socialized into a cultural, educational, and financial elite that sets them apart from average Americans. While some representatives do have strong roots in their district, for many others the connection is tenuous at best. Even for those members who were born and raised in the part of the country they represent, that place is often no longer their true home. Educated at expensive colleges, likely on the coasts, they spend their 20s and 30s in the nation's great metropolitan centers. After stints in law, business, or finance, or on Capitol Hill, they move to the hinterlands out of political ambition. Once they retire from Congress, even if they retain some kind of home in their district, few make it the center of their lives: They seem much more likely than their predecessors to pursue lucrative opportunities in cities such as New York, San Francisco, and, of course, Washington. By just about every metric—from life experience to education to net worth—these politicians are thoroughly disconnected from the rest of the population.

The massive influence that money yields in Washington is hardly a secret. But another, equally important development has largely gone ignored: More and more issues have simply been taken out of democratic contestation.

In many policy areas, the job of legislating has been supplanted by so-called independent agencies such as the Federal Communications Commission, the Securities and Exchange Commission, the Environmental Protection Agency, and the Consumer Financial Protection Bureau. Once they are founded by Congress, these organizations can formulate policy on their own. In fact, they are free from legislative oversight to a remarkable degree, even though they are often charged with settling issues that are not just technically complicated but politically controversial.

The range of crucial issues that these agencies have taken on testifies to their importance. From banning the use of the insecticide DDT to ensuring the quality of drinking water, for example, the EPA has been a key player in fights about environmental policy for almost 50 years; more recently, it has also made itself central to the American response to climate change, regulating pollutants and proposing limits on carbon-dioxide emissions from new power plants.

While independent agencies occasionally generate big headlines, they often wield their real power in more obscure policy areas. They are now responsible for the vast majority of new federal regulations. A 2008 article in the California Law Review noted that, during the previous year, Congress had enacted 138 public laws. In the same year, federal agencies had finalized 2,926 rules. Such rules run the gamut from technical stipulations that affect only a few specialized businesses to substantial reforms that have a direct impact on the lives of millions. In October 2017, for example, the Consumer Financial Protection Bureau passed a rule that would require providers of payday loans to determine whether customers would actually be able to pay them back—potentially saving millions of people from exploitative fees, but also making it more difficult for them to access cash in an emergency.

The rise of independent agencies such as the EPA is only a small piece of a larger trend in which government has grown less accountable to the people. In the latter half of the 20th century, the Federal Reserve won much greater independence from elected politicians and began to deploy far more powerful monetary tools. Trade treaties, from NAFTA to more-recent agreements with countries such as Australia, Morocco, and South Korea, have restricted Congress's ability to set tariffs, subsidize domestic industries, and halt the inflow of certain categories of migrant workers. At one point I planned to count the number of treaties to which the United States is subject; I gave up when I realized that the State Department's "List of Treaties and Other International Agreements of the United States" runs to 551 pages.

Most of these treaties and agreements offer real benefits or help us confront urgent challenges. Whatever your view of their merit, however, there is no denying that they curtail the power of Congress in ways that also disempower American voters. Trade treaties, for example, can include obscure provisions about “investor–state dispute settlements,” which give international arbitration courts the right to award huge sums of money to corporations if they are harmed by labor or environmental standards—potentially making it riskier for Congress to pass such measures.

This same tension between popular sovereignty and good governance is also evident in the debates over the power of the nine unelected justices of the Supreme Court. Since the early 1950s, the Supreme Court has ended legal segregation in schools and universities. It has ended and then reintroduced the death penalty. It has legalized abortion. It has limited censorship on television and the radio. It has decriminalized homosexuality and allowed same-sex marriage. It has struck down campaign-finance regulations and gun-control measures. It has determined whether millions of people get health insurance and whether millions of undocumented immigrants need to live in fear of being deported.

Whether you see judicial review as interpreting the law or usurping the people’s power probably depends on your view of the outcome. The American right has long railed against “activist judges” while the American left, which enjoyed a majority on the Court for a long stretch during the postwar era, has claimed that justices were merely doing their job. Now that the Court has started to lean further right, these views are rapidly reversing. But regardless of your politics, there’s no question that the justices frequently play an outsize role in settling major political conflicts—and that many of their decisions serve to amplify undemocratic elements of the system.

Take Citizens United. By overturning legislation that restricted campaign spending by corporations and other private groups, the Supreme Court issued a decision that was unpopular at the time and has remained unpopular since. (In a 2015 poll by Bloomberg, 78 percent of respondents disapproved of the ruling.) It also massively amplified the voice of moneyed interest groups, making it easier for the economic elite to override the preferences of the population for years to come.

Donald Trump is the first president in the history of the United States to have served in no public capacity before entering the White House. He belittles experts, seems to lack the most basic grasp of public policy, and loves to indulge the worst whims of his supporters. In all things, personal and political, Plato's disdainful description of the "democratic man" fits the 45th president like a glove: Given to "false and braggart words and opinions," he considers "insolence 'good breeding,' license 'liberty,' prodigality 'magnificence,' and shamelessness 'manly spirit.' "

It is little wonder, then, that Plato’s haughty complaint about democracy—its primary ill, he claimed, consists in “assigning a kind of equality indiscriminately to equals and unequals alike”—has made a remarkable comeback. As early as 2003, the journalist Fareed Zakaria argued, “There can be such a thing as too much democracy.” In the years since, many scholars have built this case: The political scientist Larry Bartels painstakingly demonstrated just how irrational ordinary voters are; the political philosopher Jason Brennan turned the premise that irrational or partisan voters are terrible decision makers into a book titled Against Democracy; and Parag Khanna, an inveterate defender of globalization, argued for a technocracy in which many decisions are made by “committees of accountable experts.” Writing near the end of the 2016 primary season, when Trump’s ascent to the Republican nomination already looked unstoppable, Andrew Sullivan offered the most forceful distillation of this line of antidemocratic laments: “Democracies end when they are too democratic,” the headline of his essay announced. “And right now, America is a breeding ground for tyranny.”

The antidemocratic view gets at something real. What makes our political system uniquely legitimate, at least when it functions well, is that it manages to deliver on two key values at once: liberalism (the rule of law) and democracy (the rule of the people). With liberalism now under concerted attack from the Trump administration, which has declared war on independent institutions such as the FBI and has used the president’s pulpit to bully ethnic and religious minorities, it’s perhaps understandable that many thinkers are willing to give up a modicum of democracy to protect the rule of law and the country’s most vulnerable groups.

If only it were that easy. As we saw in 2016, the feeling that power is slipping out of their hands makes citizens more, not less, likely to entrust their fate to a strongman leader who promises to smash the system. And as the examples of Egypt, Thailand, and other countries have demonstrated again and again, a political elite with less and less backing from the people ultimately has to resort to more and more repressive steps to hold on to its power; in the end, any serious attempt to sacrifice democracy in order to safeguard liberty is likely to culminate in an end to the rule of law as well as the rule of the people.

The easy alternative is to lean in the other direction, to call for as much direct democracy as possible. The origins of the people’s displacement, the thinking goes, lie in a cynical power grab by financial and political elites. Large corporations and the superrich advocated independent central banks and business-friendly trade treaties to score big windfalls. Politicians, academics, and journalists favor a technocratic mode of governance because they think they know what’s best and don’t want the people to meddle. All of this selfishness is effectively cloaked in a pro-market ideology propagated by think tanks and research outfits that are funded by rich donors. Since the roots of the current situation are straightforwardly sinister, the solutions to it are equally simple: The people need to reclaim their power—and abolish technocratic institutions.

This antitechnocratic view has currency on both ends of the political spectrum. On the far left, the late political scientist Peter Mair, writing about Europe, lamented the decline in “popular” democracy, which he contrasted with a more top-down “constitutional” democracy. The English sociologist Colin Crouch has argued that even anarchy and violence can serve a useful purpose if they seek to vanquish what he calls “post-democracy.”

The far right puts more emphasis on nationalism, but otherwise agrees with this basic analysis. In the inaugural issue of the journal American Affairs, the self-styled intellectual home of the Trump movement, its founder Julius Krein decried “the existence of a transpartisan elite,” which sustains a pernicious “managerial consensus.” Steve Bannon, the former White House chief strategist, said his chief political objective was to return power to the people and advocated for the “deconstruction of the administrative state.”

Mair and Crouch, Krein and Bannon are right to recognize that the people have less and less hold over the political system, an insight that can point the way to genuine reforms that would make our political system both more democratic and better functioning. One of the reasons well-intentioned politicians are so easily swayed by lobbyists, for example, is that their staffs lack the skills and experience to draft legislation or to understand highly complex policy issues. This could be addressed by boosting the woefully inadequate funding of Congress: If representatives and senators were able to attract—and retain—more knowledgeable and experienced staffers, they might be less tempted to let K Street lobbyists write their bills for them.

Similarly, the rules that currently govern conflicts of interest are far too weak. There is no reason members of Congress should be allowed to lobby for the companies they were supposed to regulate so soon after they step down from office. It is time to jam the revolving door between politics and industry.

Real change will also require an ambitious reform of campaign finance. Because of Citizens United, this is going to be extremely difficult. But the Supreme Court has had a change of heart in the past. As evidence that the current system threatens American democracy keeps piling up, the Court might finally recognize that stricter limits on campaign spending are desperately needed.

For all that the enemies of technocracy get right, though, their view is ultimately as simplistic as the antidemocratic one. The world we now inhabit is extremely complex. We need to monitor hurricanes and inspect power plants, reduce global carbon emissions and contain the spread of nuclear weapons, regulate banks and enforce consumer-safety standards. All of these tasks require a tremendous amount of expertise and a great degree of coordination. It’s unrealistic to think that ordinary voters or even their representatives in Congress might become experts in what makes for a safe power plant, or that the world could find an effective response to climate change without entering cumbersome international agreements. If we simply abolish technocratic institutions, the future for most Americans will look more rather than less dangerous, and less rather than more affluent.


It is true that to recover its citizens’ loyalty, our democracy needs to curb the power of unelected elites who seek only to pad their influence and line their pockets. But it is also true that to protect its citizens’ lives and promote their prosperity, our democracy needs institutions that are, by their nature, deeply elitist. This, to my mind, is the great dilemma that the United States—and other democracies around the world—will have to resolve if they wish to survive in the coming decades.

We don’t need to abolish all technocratic institutions or merely save the ones that exist. We need to build a new set of political institutions that are both more responsive to the views and interests of ordinary people, and better able to solve the immense problems that our society will face in the decades to come.

Writing about the dawn of democracy in his native Italy, the great novelist Giuseppe Tomasi di Lampedusa has Tancredi, a young aristocrat, recognize that he will have to let go of some of his most cherished habits to rescue what is most valuable in the old order: “If everything is to stay the same,” Tancredi says, “everything has to change.” The United States is now at an inflection point of its own. If we rigidly hold on to the status quo, we will lose what is most valuable in the world we know, and find ourselves cast as bit players in the fading age of liberal democracy. Only by embarking on bold and imaginative reform can we recover a democracy worthy of the name.
