Battling these instances of ceremonial deism may hurt the case against truly theocratic gestures


The Supreme Court has, not surprisingly, declined to reinstate a lawsuit
challenging the constitutionality of adding the phrase "so help me God" to the
presidential oath of office and including prayers in the inaugural
ceremony. The suit, Newdow v. Roberts, brought
by Michael Newdow, the American Humanist Association's legal center, and a long list of non-theist organizations and individuals, was
dismissed for lack of standing by the D.C. Court of Appeals. The
plaintiffs never did secure a hearing on the merits of their claim. They
would have lost on the merits, but their argument is worth examining
anyway. It exemplifies what advocates of official religiosity have
powerfully derided as an "offended observer" claim.

Citing the
unique importance of inaugurations -- "the grandest ceremonies in our
national existence" -- plaintiffs assert that they "have a right to view
their government in action without being forced to confront official
endorsements of religious dogma with which they disagree ... Prayers
that declare there is a God exclude Atheistic Americans ... making them
feel like 'outsiders' due to their personal religious beliefs." (Never
mind that the plaintiffs' argument is based on their lack of religious
beliefs.) They claim to have been injured by "being personally
compelled," as the price of viewing the inauguration, "to endure
government sponsored, clergy led prayer to a (Christian) God," and "to
witness the Chief Justice, without any authority, alter the presidential
oath ... so that it includes the purely religious phrase, 'so help me
God.'" Some non-theists "have felt compelled" not to view the
inauguration in order to protect themselves and their children from its
religious rituals.


Strictly, constitutionally speaking, I agree
with the plaintiffs that official pleas or shows of obeisance to a Deity
violate prohibitions on establishing religion. But I consider the
inaugural violations minimal and tolerable. Maybe I'm just resigned to
them, but I'm grateful not to be burdened by the sensibilities that make
my co-irreligionists feel oppressed by some ceremonial God-talk. Thanks
to my insensitivity, their recitations of injuries (about
government coercion and the necessity of "enduring" official prayers in
order to view the inauguration or avoiding the inauguration to avoid
enduring the prayers) sound a bit melodramatic to me.

This
question of whether occasional instances of ceremonial deism inflict
cognizable injuries on non-believers is not just part of the substantive
constitutional challenge in Newdow v. Roberts. It's the core
standing question raised by the "offended observer" doctrine in
establishment clause challenges to official religious symbols or
practices. By attacking offended-observer standing, opponents of separating
church and state reduce pleas to curb majoritarian religious
establishment in the interest of minority rights to complaints about
political correctness. It's a promising strategy. Even I might agree
that trivial instances of ceremonial deism, like robotic references to
God, are relatively innocuous (and arguably degrade religious belief
nearly as much as they stigmatize atheists). The trouble is that, in
some cases, plaintiffs are being denied standing as "offended observers"
to challenge official deism that isn't merely ceremonial.

Imagine
a continuum between ceremonial deism and theocracy. "In God We Trust"
on dollar bills clearly falls on the insignificant, ceremonial side,
while sectarian prayers at a public, governmental meeting belong on the
theocratic side, as courts sometimes agree. In 2006, the Fifth Circuit
Court of Appeals granted an injunction against Louisiana's Tangipahoa school board, which opens its public meetings with a prayer offered by
someone chosen by the board. "The prayers often include references to
'Jesus' and 'Jesus Christ,'" according to Americans United.
"Indeed, the School Board, by a vote of 9-0, rejected a proposal to
limit the prayers to a 'brief non-sectarian, non-proselytizing
invocation.'"

A three-judge panel of the Fifth Circuit enjoined this
intentionally sectarian practice that the school board had affirmatively
declined to correct. But the full court disagreed, without reaching the
merits. In a divided ruling,
it denied plaintiffs standing to challenge the prayers, finding no
evidence in the record that they had attended school board meetings at
which the prayers were offered. The dissent disagreed on the facts,
asserting that the plaintiffs' presence at board meetings had been conceded.

But should the question of their attendance have mattered?
Shouldn't any citizen taxpayer have standing to challenge an arguably
unconstitutional exercise of official sectarianism? Not according to the
Supreme Court. In a 2007 case, Hein v. Freedom From Religion Foundation,
involving the Bush administration's funding of sectarian social
services, the Court held that we have no standing, as mere taxpayers, to
challenge sectarian religious actions by the executive branch. If the
Court eventually decides that we also lack standing as "offended
observers" in establishment clause cases, it will be difficult for
anyone who isn't actually forced to engage in official prayers or other
forms of worship to challenge state-sponsored, (taxpayer-funded)
sectarianism.

Or so the opponents of church/state separation may hope. David French, a formidable conservative civil libertarian
(with whom I agree on many issues not involving official establishment
of religion) characterizes
offended-observer standing as "pernicious ... one of the most divisive
and biased procedural elements yet devised in constitutional law. Under
this unique doctrine, a few plaintiffs can undo decades of community
consensus and rip apart long-standing traditions -- not because they
have been coerced in any way but simply because they were 'offended'
when they heard a public prayer or saw a nativity scene on public land."

I
understand French's exasperation. As long as people are free to look
away from offensive religious displays, why should they be empowered to
deprive their neighbors of whatever comfort the displays provide? But
you might also ask why members of majority faiths should be empowered to
impose their religious symbols or practices on minorities who, as
taxpayers, are required to fund them. It's a close question involving
the balancing of harms.

Justice Scalia is sympathetic
to "the interest of the overwhelming majority of religious believers in
being able to give God thanks and supplication as a people" and
indifferent to the interests of offended religious and irreligious
minorities. In Scalia's view, the First Amendment "permits (the)
disregard of polytheists and believers in unconcerned deities, just as
it permits the disregard of devout atheists." I am sympathetic to the
sense of alienation and exclusion experienced by minorities confronted
with official majoritarian religiosity and indifferent to interests of
the majority in obtaining official endorsements and expressions of their
views. It is probably no coincidence that Scalia is a devout believer
and (to say the least) I am not.

Although I wouldn't bother
challenging routine references to God on dollar bills, in the Pledge of
Allegiance (which everyone has a right not to recite), or in the presidential oath of office -- partly for strategic reasons and partly
out of indifference -- I'd draw the line at agreeing to tolerate
publicly funded nativity scenes, Ten Commandments displays, or official
sectarian prayers at football games, graduations, and school-board
meetings, among other practices. Scalia would probably draw the line at
requiring me to pray.

But if official religious practices can
pose cognizable harm to religious and irreligious minorities, they also
offer indirect, unintended benefits: a visceral respect for minority
rights that can nurture a willingness to defend them and an
understanding of your own outsider status as a secular non-believer in a
religious, predominantly Christian country.

Plaintiffs who
unsuccessfully challenged the religiosity of inaugural ceremonies worry
about protecting "impressionable young children" from the "coercive
imposition of religious dogma," stressing that it "stigmatizes" atheists
and "send(s) the message that God is part and parcel of Americanism." I
agree. But the message, however offensive, is partly true. In the
popular view, atheists are stereotypically un-American as well as
immoral. I understand that non-theist groups are trying to change that
view, and I wish them luck. But I doubt that initiating futile lawsuits
against popular, rote references to God is the right strategy for
effecting social change or shielding "impressionable" children from
religious dogma. How might children be protected against official
religiosity and the stigma of non-theism? Arm them with the emotional
and intellectual independence not to take either too seriously.
