Famed aviator Charles Lindbergh speaks at a rally against American involvement in World War II in October 1941. (Associated Press)

As a quasi-socialist lefty who believes in gun regulation and health care for all, it's been more than a little upsetting the past week to realize that I am rooting for the Tea Party to stymie my president and hand him a humiliating foreign-policy defeat.

I’ve encountered this dilemma before. I thought seriously about voting for Ron Paul last year on the basis of his anti-war stance, for example, even though there's plenty of evidence that he has supported vicious racist propaganda in a manner which should disqualify him from being a dogcatcher, much less the nation’s chief executive. The far-right fringe in America holds many morally abominable views. But it also is the most influential political bloc willing to oppose our bipartisan consensus in favor of endless military intervention.

It would be comforting to think that this combination of anti-imperialism and prejudice is an accidental blip, that Paul's racism could be pried free of his isolationism and I could sign on to the latter without worrying about the former. That's not the case, though. In his book War and the American Difference, Stanley Hauerwas traces America's tendency to link government military action and virtue to Union rhetoric about the Civil War.

The Gettysburg Address, with its liturgical language of death as consecration, and its insistence that sacrifice of life requires further war in a holy cause, is the eloquent blueprint for all humanitarian intervention.* After Gettysburg, Hauerwas says, "American wars must be wars in which the sacrifices of those doing the dying and the killing have redemptive purpose and justification."

If Gettysburg makes intervention holy, then it follows that resistance to linking federal military action and virtue would have its origins in the Union's enemy. And indeed, the most passionate American arguments in favor of self-determination and against occupation come from the Confederacy, and from racist neo-Confederate myths about Reconstruction.

Walter Benn Michaels has pointed out that Thomas Dixon's The Clansman and Thomas Nelson Page's Red Rock are both explicitly anti-imperialist and explicitly racist. "Red Rock," Michaels writes in Our America, "tells the story of a conquered people, of how they survived under occupation, and of how they eventually 'reconquered' what it sometimes refers to as their 'country' and sometimes as their 'section.'" The conquered people are, of course, Southern whites, and the reconquest is a reimposition of brutal racial apartheid.

Given this history, the libertarian, anti-government thread of conservative isolationism starts to look more than a little repulsive. The liberal, federalist interventionists, like Wilson and FDR and LBJ, want to intervene on behalf of various non-white folks. The anti-interventionists (like, say, John Calhoun or Charles Lindbergh or David Duke) don't want to, because intervening on behalf of non-white folks is dangerous federal overreach. Ron Paul's racism and Ron Paul's isolationism aren't arbitrarily slapped together. They're two strands of a single, long-standing, and very unpleasant ideology.

So, to avoid racism, should we just bomb everybody? Obviously that conclusion doesn't follow, and part of the reason is that federal interventionists have a checkered history as well. If there's a racist anti-imperialism, there's certainly a racist interventionism too, as Rudyard Kipling demonstrated when he urged the U.S. to take up the white man's burden and intervene in the Philippines.

And so we did, involving ourselves in a long counterinsurgency campaign that James Loewen has suggested served as a bloody, forgotten prototype for our racially tinged war in Vietnam. Woodrow Wilson, a kind of neo-Confederate himself, re-segregated the federal government and used his position to violate civil liberties with a paranoid thoroughness that even our post-9/11 presidents have failed to surpass. The sainted FDR had his homegrown concentration camps, while Truman had the bombing of Nagasaki, which Kurt Vonnegut called "the most racist, nastiest act by this country after human slavery."

So I'm left with two bad choices. I can advocate for the ugly tradition of American anti-imperialism. Or I can advocate for the ugly tradition of American intervention. On the one hand, insular nativism; on the other hand, racially tinged imperialism. And, on both hands, lots of blood.

There are certainly honorable strains in both the American interventionist and non-interventionist traditions as well, whether you draw the line from the Emancipation Proclamation to the liberation of the concentration camps, or from Thoreau's civil disobedience to the Vietnam War protests.

I think it's worth looking at the bad as well as the good, though. Being paralyzed with guilt isn't helpful, but maybe a little humility on all sides could be. Arguments around military action are often framed in binary terms — which is more righteous, to intervene or not to intervene? Given our particular history of intervention and non-intervention, though, one starts to wonder whether we have the moral standing to make either choice in an ethical way.

Maybe, rather than the constant question of how many more bombs we should pay for, and where we should drop them next, it might be worth trying to think through what we would have to do, as a nation, or a people, or a community, to make our decisions about war and peace moral. As Hauerwas asked in his interview here last week, what would a just-war Pentagon look like? I don't know the answer to that — which is why, by default, I'm stuck rooting for the Tea Party to salvage our Syria policy. But I do know that, as it is, the world is entitled to look on our principled isolationism and our expansionist humanitarianism with equal mistrust.

* Correction: An earlier version of this post incorrectly referred here to the Emancipation Proclamation. We regret the error.
