“The United States is a Christian nation.” If I had a nickel for every time I’ve heard this statement at a religious Right meeting or in the media, I wouldn’t be rich—but I’d probably have enough to buy a really cool iPad. The assertion is widely believed by followers of the religious Right and often repeated—and, too often, it seeps into the beliefs of the rest of the population as well. But like other myths that are widely accepted (you use only 10 percent of your brain, vitamin C helps you get over a cold, and the like), it lacks a factual basis.

Over the years, numerous scholars, historians, lawyers, and judges have debunked the “Christian nation” myth. Yet it persists. Does it have any basis in American history? Why is the myth so powerful? What psychological need does it fill?