War now begins with but a wave of the executive hand, yet seems impossible to end. Congress authorized military action against Al Qaeda in 2001, and against the government of Iraqi dictator Saddam Hussein in 2002. Hussein is long gone, and Al Qaeda may or may not still exist; legally, however, these wars grind on, the longest-running pair in American history.

Thursday night, without a word to Congress or the public, the Trump Administration impulsively began a third conflict. Neither of the existing authorizations could even remotely be said to authorize Thursday's attack on Syria. We have no way of knowing whether it was a brief executive whim or the beginning of a third nightmare that will outlast the other two.

Here is President Trump's statement of the aims of the missile strike: "I call on all civilized nations to join us in seeking to end the slaughter and bloodshed in Syria, and also to end terrorism of all kinds and all types." His message to Congress is pure boilerplate: "I acted in the vital national security and foreign policy interests of the United States, pursuant to my constitutional authority as Commander-in-Chief and Chief Executive."

I have no idea what policy in Syria would be effective in protecting that country's people from their murderous government, ending the country's murderous civil war, blocking increased influence by Iran and Russia in the Middle East, and stemming the rise of the Islamic State. I do not know which of the armed groups fighting against the government of Bashar al-Assad are fighting for something like democracy and which are seeking religious or ethnic dictatorships. I can't tell you how a global superpower can affect the political and military outcome of a multi-sided civil war without itself becoming involved in ground operations and even occupation—and dangerous conflict with other major powers.

Wise heads in the Pentagon and the National Security Council may actually have a plan; they may know friend from foe; they may have imagined the next step, and the next, wargaming intelligent American responses in the likely event that things in Syria do not immediately fall into place.

But I don't know. No one outside the small world of national security knows. The president has not even pretended to tell us whether he has any plan at all, and if so, what it is.

Indeed, from his statements on Thursday, it seems at least possible that, moved by the horrific video images of Syrian children suffocated by a gas attack, Trump simply decided to "do something" because, as he said in his statement, "No child of god should ever suffer such horror."

The impulse to act without thinking would be understandable. Careful planning and clarity of purpose aren't easy in a fast-moving international situation. But the framers of the Constitution constructed a mechanism to reduce or eliminate the danger that a president would take the nation to war on a fleeting impulse.

They gave the power to initiate war not to the president but to Congress.

Article I § 8 cl. 11 gives Congress the prerogative to "declare war." The powers surrounding that grant suggest that everything about the choice of war is a congressional power. Article II § 2 cl. 1 makes the president "commander in chief of the Army and Navy ... and of the militia"—but the context doesn't suggest that is the power to "command" the entire nation into war. In fact, the original wording of Article I gave Congress the entire power to "make war," with nothing given to the president. The Convention amended the words to "declare war," in order to give the president the narrow power to, in the words of Madison's Notes, "repel sudden attacks."

Roger Sherman of Connecticut objected that the wording gave the president too little power. He was rebuked by Elbridge Gerry of Massachusetts, Oliver Ellsworth of Connecticut, and George Mason of Virginia, who all agreed that reposing the war choice in the executive would be dangerous.

"Mr. GERRY," the Notes record, "never expected to hear, in a republic, a motion to empower the Executive alone to declare war." Though often flouted by overreaching presidents and craven Congresses, that division of power remains the law of the land.

Unless the United States or its vital interests are under attack, the president must ask Congress before using military force against another country. The restriction may seem largely formal—the last time Congress actually turned down such a request was when the Senate blocked the Armed Ship Bill requested by President Woodrow Wilson in 1917. But it's not an empty formality. A president who wants to take the nation into war should be able to put the request in writing, explain it to the nation in his own words, and send diplomats and military leaders to lay out the policy behind the request, the scope of the conflict it is likely to produce, and the criteria by which the nation can measure success.

The action in Syria does not respond to "a sudden attack" on the United States. Nor can the Administration even try to hide behind the Obama-era dodge that it is not sending U.S. forces into "hostilities." If that rationalization ever served, it doesn't do so here. While the renewed use of chemical weapons in Syria may be an emergency, I haven't yet heard an explanation of why the United States cannot do what any prudent power would—follow its own laws and international law, persuade and inform its own people, seek the assent of the United Nations Security Council, consult its allies, and set forth its war aims.

In retrospect, America's failure to intervene in Syria in 2013 was probably a historic mistake.

It was not Barack Obama's mistake alone, but one made by the entire nation. Obama announced his plans but then asked Congress to approve them—precisely as the Constitution requires. His critics portray this request—the one time when Obama actually refused to violate the war powers system of the Constitution—as his personal folly. But members of both his own and the opposition parties in Congress made it known that they would not back his request; many elder statesmen opposed the intervention; and national polls showed a solid majority of the public opposed.

After the apparent nerve gas attack last week, the current president issued a grotesquely improper official statement blaming Obama for Assad's recent crime. In 2013, however, private-citizen Trump had loudly and strenuously objected to Obama's planned intervention and demanded the president seek the approval of Congress.

However harshly history may judge the default of 2013, it was a failure not of one leader but of national will. Has the nation regained its will for a prolonged, bloody, and morally ambiguous struggle in Syria? Or will the United States drop the conflict after a few loud booms? That Donald Trump has, for the moment at least, changed his mind is not only of limited constitutional relevance—it also tells us less than nothing about whether the American people understand, accept, and embrace what may be needed for the intervention to succeed.

Harry S. Truman was the first American president to commit to a major intervention—in Korea—without even the pretense of congressional approval. Because he had set forth no war aims, he could not resist Gen. Douglas MacArthur's pressure to blunder into war with China. Because he had not obtained congressional assent, he found himself alone when the war turned dangerous.

If Truman were here to warn Trump, would he, or would anyone in 2017 Washington, even listen? They have not listened to Madison, or Gerry, or Ellsworth, or Mason.
