Like a lot of people these days, I'm a recovering secularist. Until September 11 I accepted the notion that as the world becomes richer and better educated, it becomes less religious. Extrapolating from a tiny and unrepresentative sample of humanity (in Western Europe and parts of North America), this theory holds that as history moves forward, science displaces dogma and reason replaces unthinking obedience. A region that has not yet had a reformation and an enlightenment, such as the Arab world, sooner or later will.

It's now clear that the secularization theory is untrue. The human race does not necessarily get less religious as it grows richer and better educated. We are living through one of the great periods of scientific progress and the creation of wealth. At the same time, we are in the midst of a religious boom.

Islam is surging. Orthodox Judaism is growing among young people, and Israel has gotten more religious as it has become more affluent. The growth of Christianity surpasses that of all other faiths. In 1942 this magazine published an essay called "Will the Christian Church Survive?" Sixty years later there are two billion Christians in the world; by 2050, according to some estimates, there will be three billion. As Philip Jenkins, a Distinguished Professor of History and Religious Studies at Pennsylvania State University, has observed, perhaps the most successful social movement of our age is Pentecostalism (see "The Next Christianity," October Atlantic). Having gotten its start in Los Angeles about a century ago, it now embraces 400 million people—a number that, according to Jenkins, could reach a billion or more by the half-century mark.

Moreover, it is the denominations that refuse to adapt to secularism that are growing the fastest, while those that try to be "modern" and "relevant" are withering. Ecstatic forms of Christianity and "anti-modern" Islam are thriving. The Christian population in Africa, which was about 10 million in 1900 and is currently about 360 million, is expected to grow to 633 million by 2025, with conservative, evangelical, and syncretistic groups dominating. In Africa churches are becoming more influential than many nations, with both good and bad effects.

Secularism is not the future; it is yesterday's incorrect vision of the future. This realization sends us recovering secularists to the bookstore or the library in a desperate attempt to figure out what is going on in the world. I suspect I am not the only one who since September 11 has found himself reading a paperback edition of the Koran that was bought a few years ago in a fit of high-mindedness but was never actually opened. I'm probably not the only one boning up on the teachings of Ahmad ibn Taymiyya, Sayyid Qutb, and Muhammad ibn Abd al-Wahhab.

There are six steps in the recovery process. First you have to accept the fact that you are not the norm. Western foundations and universities send out squads of researchers to study and explain religious movements. But as the sociologist Peter Berger has pointed out, the phenomenon that really needs explaining is the habits of the American professoriat: religious groups should be sending out researchers to try to understand why there are pockets of people in the world who do not feel the constant presence of God in their lives, who do not fill their days with rituals and prayers and garments that bring them into contact with the divine, and who do not believe that God's will should shape their public lives.

Once you accept this—which is like understanding that the earth revolves around the sun, not vice versa—you can begin to see things in a new way.

The second step toward recovery involves confronting fear. For a few years it seemed that we were all heading toward a benign end of history, one in which our biggest worry would be boredom. Liberal democracy had won the day. Yes, we had to contend with globalization and inequality, but these were material and measurable concepts. Now we are looking at fundamental clashes of belief and a truly scary situation—at least in the Southern Hemisphere—that brings to mind the Middle Ages, with weak governments, missionary armies, and rampant religious conflict.

The third step is getting angry. I now get extremely annoyed by the secular fundamentalists who are content to remain smugly ignorant of enormous shifts occurring all around them. They haven't learned anything about religion, at home or abroad. They don't know who Tim LaHaye and Jerry B. Jenkins are, even though those co-authors have sold 42 million copies of their books. They still don't know what makes a Pentecostal a Pentecostal (you could walk through an American newsroom and ask that question, and the only people who might be able to answer would be the secretaries and the janitorial staff). They still don't know about Michel Aflaq, the mystical Arab nationalist who served as a guru to Saddam Hussein. A great Niagara of religious fervor is cascading down around them while they stand obtuse and dry in the little cave of their own parochialism—and many of them are journalists and policy analysts, who are paid to keep up with these things.

The fourth step toward recovery is to resist the impulse to find a materialistic explanation for everything. During the centuries when secularism seemed the wave of the future, Western intellectuals developed social-science models of extraordinary persuasiveness. Marx explained history through class struggle, other economists explained it through profit maximization. Professors of international affairs used conflict-of-interest doctrines and game theory to predict the dynamics between nation-states.

All these models are seductive and partly true. This country has built powerful institutions, such as the State Department and the CIA, that use them to try to develop sound policies. But none of the models can adequately account for religious ideas, impulses, and actions, because religious fervor can't be quantified and standardized. Religious motivations can't be explained by cost-benefit analysis.

Over the past twenty years domestic-policy analysts have thought hard about the roles that religion and character play in public life. Our foreign-policy elites are at least two decades behind. They go for months ignoring the force of religion; then, when confronted with something inescapably religious, such as the Iranian revolution or the Taliban, they begin talking of religious zealotry and fanaticism, which suddenly explains everything. After a few days of shaking their heads over the fanatics, they revert to their usual secular analyses. We do not yet have, and sorely need, a mode of analysis that attempts to merge the spiritual and the material.

The recovering secularist has to resist the temptation to treat religion as a mere conduit for thwarted economic impulses. For example, we often say that young Arab men who have no decent prospects turn to radical Islam. There's obviously some truth to this observation. But it's not the whole story: neither Mohammed Atta nor Osama bin Laden, for example, was poor or oppressed. And although it's possible to construct theories that explain their radicalism as the result of alienation or some other secular factor, it makes more sense to acknowledge that faith is its own force, independent of and perhaps greater than economic resentment.

Human beings yearn for righteous rule, for a just world or a world that reflects God's will—in many cases at least as strongly as they yearn for money or success. Thinking about that yearning means moving away from scientific analysis and into the realm of moral judgment. The crucial question is not What incentives does this yearning respond to? but Do individuals pursue a moral vision of righteous rule? And do they do so in virtuous ways, or are they, like Saddam Hussein and Osama bin Laden, evil in their vision and methods?

Fifth, the recovering secularist must acknowledge that he has been too easy on religion. Because he assumed that it was playing a diminishing role in public affairs, he patronized it. He condescendingly decided not to judge other creeds. They are all valid ways of approaching God, he told himself, and ultimately they fuse into one. After all, why stir up trouble by judging another's beliefs? It's not polite. The better option, when confronted by some nasty practice performed in the name of religion, is simply to avert one's eyes. Is Wahhabism a vicious sect that perverts Islam? Don't talk about it.

But in a world in which religion plays an ever larger role, this approach is no longer acceptable. One has to try to separate right from wrong. The problem is that once we start doing that, it's hard to say where we will end up. Consider Pim Fortuyn, a left-leaning Dutch politician and gay-rights advocate who criticized Muslim immigrants for their attitudes toward women and gays. When he was assassinated, last year, the press described him, on the basis of those criticisms, as a rightist in the manner of Jean-Marie Le Pen, which was far from the truth. In the post-secular world today's categories of left and right will become inapt and obsolete.

The sixth and final step for recovering secularists is to understand that this country was never very secular anyway. We Americans long for righteous rule as fervently as anybody else. We are inculcated with the notion that, in Abraham Lincoln's words, we represent the "last, best hope of earth." Many Americans have always sensed that we have a transcendent mission, although, fortunately, it is not a theological one. We instinctively feel, in ways that people from other places do not, that history is unfulfilled as long as there are nations in which people are not free. It is this instinctive belief that has led George W. Bush to respond so ambitiously to the events of September 11, and that has led most Americans to support him.

Americans are as active as anyone else in the clash of eschatologies. Saddam Hussein sees history as ending with a united Arab nation globally dominant and with himself revered as the creator of a just world order. Osama bin Laden sees history as ending with the global imposition of sharia. Many Europeans see history as ending with the establishment of secular global institutions under which nationalism and religious passions will be quieted and nation-states will give way to international law and multilateral cooperation. Many Americans see history as ending in the triumph of freedom and constitutionalism, with religion not abandoned or suppressed but enriching democratic life.

We are inescapably caught in a world of conflicting visions of historical destiny. This is not the same as saying that we are caught in a world of conflicting religions. But understanding this world means beating the secularist prejudices out of our minds every day.