Belief systems are installed at multiple layers of Maslow’s Hierarchy of Needs. Popular religions do this at most or all layers. Books about business, relationships, and the like do this at individual layers. Someone can be Christian and yet also base their entire Belonging-layer belief system on How to Win Friends & Influence People. In this sense, everyone is multi-religious.

• Allows belief system installation without threatening core beliefs.
• Belief systems installed at layers other than the Transcendence layer are easily accepted by the Transcendence layer as long as they do not attempt to override core beliefs.
• Systems must still pass logic/emotional checks at each layer.
• Ask what the current generation’s issues are in each layer. E.g., Safety/Belonging: isolation is a huge topic today but was not in previous generations.
• Choices that pit two layers against one another initiate a “high road, low road” ultimatum.
• Beliefs and layers can be synergistic. E.g., a belief that “all people are born good” reinforces beliefs about forgiveness.
• What happens when a parent belief is surrounded by dead child beliefs? E.g., someone who believes in Jesus yet not in miracles, original sin, Satan, Heaven, etc. Are child beliefs the faith-gathering leaves of the parent?
• Is it possible to live with all layers contradicting the Transcendence layer?
• Is it possible to install a multilayer system on enough layers that you can override the Transcendence layer?
• A pyramid, because it represents our beginning as babies in the physical layer.

IN THE SUMMER of 1995, a young graduate student in anthropology at UCLA named Joe Henrich traveled to Peru to carry out some fieldwork among the Machiguenga, an indigenous people who live north of Machu Picchu in the Amazon basin. The Machiguenga had traditionally been horticulturalists who lived in single-family, thatch-roofed houses in small hamlets composed of clusters of extended families. For sustenance, they relied on local game and produce from small-scale farming. They shared with their kin but rarely traded with outside groups.

While the setting was fairly typical for an anthropologist, Henrich’s research was not. Rather than practice traditional ethnography, he decided to run a behavioral experiment that had been developed by economists. Henrich used a “game”—along the lines of the famous prisoner’s dilemma—to see whether isolated cultures shared with the West the same basic instinct for fairness. In doing so, Henrich expected to confirm one of the foundational assumptions underlying such experiments, and indeed underpinning the entire fields of economics and psychology: that humans all share the same cognitive machinery—the same evolved rational and psychological hardwiring.

The test that Henrich introduced to the Machiguenga was called the ultimatum game. The rules are simple: in each game there are two players who remain anonymous to each other. The first player is given an amount of money, say $100, and told that he has to offer some of the cash, in an amount of his choosing, to the other subject. The second player can accept or refuse the split. But there’s a hitch: players know that if the recipient refuses the offer, both leave empty-handed. North Americans, who are the most common subjects for such experiments, usually offer a 50-50 split when on the giving end. When on the receiving end, they show an eagerness to punish the other player for uneven splits at their own expense. In short, Americans show the tendency to be equitable with strangers—and to punish those who are not.
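The payoff rules described above can be sketched in a few lines. This is an illustrative reconstruction, not code from Henrich’s study: the two responder strategies and the 30-percent rejection threshold are assumptions chosen only to contrast the punishing behavior typical of North American subjects with a responder who accepts any free money.

```python
def ultimatum_round(stake, offer, accept):
    """Return (proposer_payoff, responder_payoff) for one anonymous round.

    The proposer splits `stake`, offering `offer` to the responder.
    If the responder rejects, both players leave empty-handed.
    """
    if not 0 <= offer <= stake:
        raise ValueError("offer must be between 0 and the stake")
    if accept:
        return stake - offer, offer
    return 0, 0

# A stylized responder who punishes uneven splits at personal cost
# (the hypothetical 30% cutoff is an assumption for illustration)...
def punishing_responder(stake, offer):
    return offer >= 0.3 * stake

# ...versus one who takes any nonzero amount of free money.
def accepting_responder(stake, offer):
    return offer > 0

stake, low_offer = 100, 15
print(ultimatum_round(stake, low_offer, punishing_responder(stake, low_offer)))  # (0, 0)
print(ultimatum_round(stake, low_offer, accepting_responder(stake, low_offer)))  # (85, 15)
```

The point of the punishment move is visible in the first line of output: rejecting costs the responder $15, a real sacrifice made solely to deny the proposer $85.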

Among the Machiguenga, word quickly spread of the young, square-jawed visitor from America giving away money. The stakes Henrich used in the game with the Machiguenga were not insubstantial—roughly equivalent to the few days’ wages they sometimes earned from episodic work with logging or oil companies. So Henrich had no problem finding volunteers. What he had great difficulty with, however, was explaining the rules, as the game struck the Machiguenga as deeply odd.

When he began to run the game it became immediately clear that Machiguengan behavior was dramatically different from that of the average North American. To begin with, the offers from the first player were much lower. In addition, when on the receiving end of the game, the Machiguenga rarely refused even the lowest possible amount. “It just seemed ridiculous to the Machiguenga that you would reject an offer of free money,” says Henrich. “They just didn’t understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game.”

Joe Henrich and research assistant Vilisi administer the Third Party Punishment Game in the village of Teci on Fiji’s Yasawa Island.

The potential implications of the unexpected results were quickly apparent to Henrich. He knew that a vast amount of scholarly literature in the social sciences—particularly in economics and psychology—relied on the ultimatum game and similar experiments. At the heart of most of that research was the implicit assumption that the results revealed evolved psychological traits common to all humans, never mind that the test subjects were nearly always from the industrialized West. Henrich realized that if the Machiguenga results stood up, and if similar differences could be measured across other populations, this assumption of universality would have to be challenged.

Henrich had thought he would be adding a small branch to an established tree of knowledge. It turned out he was sawing at the very trunk. He began to wonder: What other certainties about “human nature” in social science research would need to be reconsidered when tested across diverse populations?

Henrich soon landed a grant from the MacArthur Foundation to take his fairness games on the road. With the help of a dozen other colleagues he led a study of 14 other small-scale societies, in locales from Tanzania to Indonesia. Differences abounded in the behavior of both players in the ultimatum game. In no society did he find people who were purely selfish (that is, who always offered the lowest amount, and never refused a split), but average offers from place to place varied widely and, in some societies—ones where gift-giving is heavily used to curry favor or gain allegiance—the first player would often make overly generous offers in excess of 60 percent, and the second player would often reject them, behaviors almost never observed among Americans.

The research established Henrich as an up-and-coming scholar. In 2004, he was given the U.S. Presidential Early Career Award for young scientists at the White House. But his work also made him a controversial figure. When he presented his research to the anthropology department at the University of British Columbia during a job interview a year later, he recalls a hostile reception. Anthropology is the social science most interested in cultural differences, but the young scholar’s methods of using games and statistics to test and compare cultures with the West seemed heavy-handed and invasive to some. “Professors from the anthropology department suggested it was a bad thing that I was doing,” Henrich remembers. “The word ‘unethical’ came up.”

So instead of toeing the line, he switched teams. A few well-placed people at the University of British Columbia saw great promise in Henrich’s work and created a position for him, split between the economics department and the psychology department. It was in the psychology department that he found two kindred spirits in Steven Heine and Ara Norenzayan. Together the three set about writing a paper that they hoped would fundamentally challenge the way social scientists thought about human behavior, cognition, and culture.

A MODERN LIBERAL ARTS education gives lots of lip service to the idea of cultural diversity. It’s generally agreed that all of us see the world in ways that are sometimes socially and culturally constructed, that pluralism is good, and that ethnocentrism is bad. But beyond that the ideas get muddy. That we should welcome and celebrate people of all backgrounds seems obvious, but the implied corollary—that people from different ethno-cultural origins have particular attributes that add spice to the body politic—becomes more problematic. To avoid stereotyping, it is rarely stated bluntly just exactly what those culturally derived qualities might be. Challenge liberal arts graduates on their appreciation of cultural diversity and you’ll often find them retreating to the anodyne notion that under the skin everyone is really alike.

If you take a broad look at the social science curriculum of the last few decades, it becomes a little more clear why modern graduates are so unmoored. The last generation or two of undergraduates have largely been taught by a cohort of social scientists busily doing penance for the racism and Eurocentrism of their predecessors, albeit in different ways. Many anthropologists took to the navel gazing of postmodernism and swore off attempts at rationality and science, which were disparaged as weapons of cultural imperialism.

Economists and psychologists, for their part, did an end run around the issue with the convenient assumption that their job was to study the human mind stripped of culture. The human brain is genetically comparable around the globe, it was agreed, so human hardwiring for much behavior, perception, and cognition should be similarly universal. No need, in that case, to look beyond the convenient population of undergraduates for test subjects. A 2008 survey of the top six psychology journals dramatically shows how common that assumption was: more than 96 percent of the subjects tested in psychological studies from 2003 to 2007 were Westerners—with nearly 70 percent from the United States alone. Put another way: 96 percent of human subjects in these studies came from countries that represent only 12 percent of the world’s population.

Henrich’s work with the ultimatum game was an example of a small but growing countertrend in the social sciences, one in which researchers look straight at the question of how deeply culture shapes human cognition. His new colleagues in the psychology department, Heine and Norenzayan, were also part of this trend. Heine focused on the different ways people in Western and Eastern cultures perceived the world, reasoned, and understood themselves in relationship to others. Norenzayan’s research focused on the ways religious belief influenced bonding and behavior. The three began to compile examples of cross-cultural research that, like Henrich’s work with the Machiguenga, challenged long-held assumptions of human psychological universality.

Some of that research went back a generation. It was in the 1960s, for instance, that researchers discovered that aspects of visual perception were different from place to place. One of the classics of the literature, the Müller-Lyer illusion, showed that where you grew up would determine to what degree you would fall prey to the illusion that these two lines are different in length:

Researchers found that Americans perceive the line with the ends feathered outward (B) as being longer than the line with the arrow tips (A). San foragers of the Kalahari, on the other hand, were more likely to see the lines as they are: equal in length. Subjects from more than a dozen cultures were tested, and Americans were at the far end of the distribution—seeing the illusion more dramatically than all others.

More recently psychologists had challenged the universality of research done in the 1950s by pioneering social psychologist Solomon Asch. Asch had discovered that test subjects were often willing to make incorrect judgments on simple perception tests to conform with group pressure. When the test was performed across 17 societies, however, it turned out that group pressure had a range of influence. Americans were again at the far end of the scale, in this case showing the least tendency to conform to group belief.

As Heine, Norenzayan, and Henrich furthered their search, they began to find research suggesting wide cultural differences almost everywhere they looked: in spatial reasoning, the way we infer the motivations of others, categorization, moral reasoning, the boundaries between the self and others, and other arenas. These differences, they believed, were not genetic. The distinct ways Americans and Machiguengans played the ultimatum game, for instance, didn’t arise from differently evolved brains. Rather, Americans, without fully realizing it, were manifesting a psychological tendency shared with people in other industrialized countries that had been refined and handed down through thousands of generations in ever more complex market economies. When people are constantly doing business with strangers, it helps when they have the desire to go out of their way (with a lawsuit, a call to the Better Business Bureau, or a bad Yelp review) when they feel cheated. Because Machiguengan culture had a different history, their gut feeling about what was fair was distinctly their own. In the small-scale societies with a strong culture of gift-giving, yet another conception of fairness prevailed. There, generous financial offers were turned down because people’s minds had been shaped by a cultural norm that taught them that the acceptance of generous gifts brought burdensome obligations. Our economies hadn’t been shaped by our sense of fairness; it was the other way around.

The growing body of cross-cultural research that the three researchers were compiling suggested that the mind’s capacity to mold itself to cultural and environmental settings was far greater than had been assumed. The most interesting thing about cultures may not be in the observable things they do—the rituals, eating preferences, codes of behavior, and the like—but in the way they mold our most fundamental conscious and unconscious thinking and perception.

For instance, the different ways people perceive the Müller-Lyer illusion likely reflect lifetimes spent in different physical environments. American children, for the most part, grow up in box-shaped rooms of varying dimensions. Surrounded by carpentered corners, their visual perception adapts to this strange new environment (strange and new in terms of human history, that is) by learning to perceive converging lines in three dimensions.

When unconsciously translated into three dimensions, the line with the outward-feathered ends (C) appears farther away, and the brain therefore judges it to be longer. The more time one spends in natural environments, where there are no carpentered corners, the less one sees the illusion.

As the three continued their work, they noticed something else that was remarkable: again and again one group of people appeared to be particularly unusual when compared to other populations—with perceptions, behaviors, and motivations that almost always sat at one far end of the human bell curve.

In the end they titled their paper “The Weirdest People in the World?” By “weird” they meant both unusual and Western, Educated, Industrialized, Rich, and Democratic. It is not just our Western habits and cultural preferences that are different from the rest of the world, it appears. The very way we think about ourselves and others—and even the way we perceive reality—makes us distinct from other humans on the planet, not to mention from the vast majority of our ancestors. Among Westerners, the data showed that Americans were often the most unusual, leading the researchers to conclude that “American participants are exceptional even within the unusual population of Westerners—outliers among outliers.”

Given the data, they concluded that social scientists could not possibly have picked a worse population from which to draw broad generalizations. Researchers had been doing the equivalent of studying penguins while believing that they were learning insights applicable to all birds.

NOT LONG AGO I met Henrich, Heine, and Norenzayan for dinner at a small French restaurant in Vancouver, British Columbia, to hear about the reception of their weird paper, which was published in the prestigious journal Behavioral and Brain Sciences in 2010. The trio of researchers are young—as professors go—good-humored family men. They recalled that they were nervous as the publication time approached. The paper basically suggested that much of what social scientists thought they knew about fundamental aspects of human cognition was likely only true of one small slice of humanity. They were making such a broadside challenge to whole libraries of research that they steeled themselves to the possibility of becoming outcasts in their own fields.

“We were scared,” admitted Henrich. “We were warned that a lot of people were going to be upset.”

“We were told we were going to get spit on,” interjected Norenzayan.

“Yes,” Henrich said. “That we’d go to conferences and no one was going to sit next to us at lunchtime.”

Interestingly, they seemed much less concerned that they had used the pejorative acronym WEIRD to describe a significant slice of humanity, although they did admit that they could only have done so to describe their own group. “Really,” said Henrich, “the only people we could have called weird are represented right here at this table.”

Still, I had to wonder whether describing the Western mind, and the American mind in particular, as weird suggested that our cognition is not just different but somehow malformed or twisted. In their paper the trio pointed out cross-cultural studies that suggest that the “weird” Western mind is the most self-aggrandizing and egotistical on the planet: we are more likely to promote ourselves as individuals versus advancing as a group. WEIRD minds are also more analytic, possessing the tendency to telescope in on an object of interest rather than understanding that object in the context of what is around it.

The WEIRD mind also appears to be unique in terms of how it comes to understand and interact with the natural world. Studies show that Western urban children grow up so closed off in man-made environments that their brains never form a deep or complex connection to the natural world. While studying children from the U.S., researchers have suggested a developmental timeline for what is called “folkbiological reasoning.” These studies posit that it is not until children are around 7 years old that they stop projecting human qualities onto animals and begin to understand that humans are one animal among many. Compared to children from Yucatec Maya communities in Mexico, however, Western urban children appear to be developmentally delayed in this regard. Children who grow up constantly interacting with the natural world are much less likely to anthropomorphize other living things into late childhood.

Given that people living in WEIRD societies don’t routinely encounter or interact with animals other than humans or pets, it’s not surprising that they end up with a rather cartoonish understanding of the natural world. “Indeed,” the report concluded, “studying the cognitive development of folkbiology in urban children would seem the equivalent of studying ‘normal’ physical growth in malnourished children.”

During our dinner, I admitted to Heine, Henrich, and Norenzayan that the idea that I can only perceive reality through a distorted cultural lens was unnerving. For me the notion raised all sorts of metaphysical questions: Is my thinking so strange that I have little hope of understanding people from other cultures? Can I mold my own psyche or the psyches of my children to be less WEIRD and more able to think like the rest of the world? If I did, would I be happier?

Henrich reacted with mild concern that I was taking this research so personally. He had not intended, he told me, for his work to be read as postmodern self-help advice. “I think we’re really interested in these questions for the questions’ sake,” he said.

The three insisted that their goal was not to say that one culturally shaped psychology was better or worse than another—only that we’ll never truly understand human behavior and cognition until we expand the sample pool beyond its current small slice of humanity. Despite these assurances, however, I found it hard not to read a message between the lines of their research. When they write, for example, that weird children develop their understanding of the natural world in a “culturally and experientially impoverished environment” and that they are in this way the equivalent of “malnourished children,” it’s difficult to see this as a good thing.

THE TURN THAT HENRICH, Heine, and Norenzayan are asking social scientists to make is not an easy one: accounting for the influence of culture on cognition will be a herculean task. Cultures are not monolithic; they can be endlessly parsed. Ethnic backgrounds, religious beliefs, economic status, parenting styles, rural upbringing versus urban or suburban—there are hundreds of cultural differences that individually and in endless combinations influence our conceptions of fairness, how we categorize things, our method of judging and decision making, and our deeply held beliefs about the nature of the self, among other aspects of our psychological makeup.

We are just at the beginning of learning how these fine-grained cultural differences affect our thinking. Recent research has shown that people in “tight” cultures, those with strong norms and low tolerance for deviant behavior (think India, Malaysia, and Pakistan), develop higher impulse control and more self-monitoring abilities than those from other places. Men raised in the honor culture of the American South have been shown to experience much larger surges of testosterone after insults than do Northerners. Research published late last year suggested psychological differences at the city level too. Compared to San Franciscans, Bostonians’ internal sense of self-worth is more dependent on community status and financial and educational achievement. “A cultural difference doesn’t have to be big to be important,” Norenzayan said. “We’re not just talking about comparing New York yuppies to the Dani tribesmen of Papua New Guinea.”

As Norenzayan sees it, the last few generations of psychologists have suffered from “physics envy,” and they need to get over it. The job, experimental psychologists often assumed, was to push past the content of people’s thoughts and see the underlying universal hardware at work. “This is a deeply flawed way of studying human nature,” Norenzayan told me, “because the content of our thoughts and their process are intertwined.” In other words, if human cognition is shaped by cultural ideas and behavior, it can’t be studied without taking into account what those ideas and behaviors are and how they are different from place to place.

This new approach suggests the possibility of reverse-engineering psychological research: look at cultural content first; cognition and behavior second. Norenzayan’s recent work on religious belief is perhaps the best example of the intellectual landscape that is now open for study. When Norenzayan became a student of psychology in 1994, four years after his family had moved from Lebanon to America, he was excited to study the effect of religion on human psychology. “I remember opening textbook after textbook and turning to the index and looking for the word ‘religion,’ ” he told me, “Again and again the very word wouldn’t be listed. This was shocking. How could psychology be the science of human behavior and have nothing to say about religion? Where I grew up you’d have to be in a coma not to notice the importance of religion on how people perceive themselves and the world around them.”

Norenzayan became interested in how certain religious beliefs, handed down through generations, may have shaped human psychology to make possible the creation of large-scale societies. He has suggested that there may be a connection between the growth of religions that believe in “morally concerned deities”—that is, a god or gods who care if people are good or bad—and the evolution of large cities and nations. To be cooperative in large groups of relative strangers, in other words, might have required the shared belief that an all-powerful being was forever watching over your shoulder.

If religion was necessary in the development of large-scale societies, can large-scale societies survive without religion? Norenzayan points to parts of Scandinavia with atheist majorities that seem to be doing just fine. They may have climbed the ladder of religion and effectively kicked it away. Or perhaps, after a thousand years of religious belief, the idea of an unseen entity always watching your behavior remains in our culturally shaped thinking even after the belief in God dissipates or disappears.

Why, I asked Norenzayan, if religion might have been so central to human psychology, have researchers not delved into the topic? “Experimental psychologists are the weirdest of the weird,” said Norenzayan. “They are almost the least religious academics, next to biologists. And because academics mostly talk amongst themselves, they could look around and say, ‘No one who is important to me is religious, so this must not be very important.’” Indeed, almost every major theorist on human behavior in the last 100 years predicted that it was just a matter of time before religion was a vestige of the past. But the world persists in being a very religious place.

HENRICH, HEINE, AND NORENZAYAN’S FEAR of being ostracized after the publication of the WEIRD paper turned out to be misplaced. Response to the paper, both published and otherwise, has been nearly universally positive, with more than a few of their colleagues suggesting that the work will spark fundamental changes. “I have no doubt that this paper is going to change the social sciences,” said Richard Nisbett, an eminent psychologist at the University of Michigan. “It just puts it all in one place and makes such a bold statement.”

More remarkable still, after reading the paper, academics from other disciplines began to come forward with their own mea culpas. Commenting on the paper, two brain researchers from Northwestern University argued that the nascent field of neuroimaging had made the same mistake as psychologists, noting that 90 percent of neuroimaging studies were performed in Western countries. Researchers in motor development similarly suggested that their discipline’s body of research ignored how different child-rearing practices around the world can dramatically influence states of development. Two psycholinguistics professors suggested that their colleagues had also made the same mistake: blithely assuming human homogeneity while focusing their research primarily on one rather small slice of humanity.

At its heart, the challenge of the WEIRD paper is not simply to the field of experimental human research (do more cross-cultural studies!); it is a challenge to our Western conception of human nature. For some time now, the most widely accepted answer to the question of why humans, among all animals, have so successfully adapted to environments across the globe is that we have big brains with the ability to learn, improvise, and problem-solve.

Henrich has challenged this “cognitive niche” hypothesis with the “cultural niche” hypothesis. He notes that the amount of knowledge in any culture is far greater than the capacity of individuals to learn or figure it all out on their own. He suggests that individuals tap that cultural storehouse of knowledge simply by mimicking (often unconsciously) the behavior and ways of thinking of those around them. We shape a tool in a certain manner, adhere to a food taboo, or think about fairness in a particular way, not because we individually have figured out that behavior’s adaptive value, but because we instinctively trust our culture to show us the way. When Henrich asked Fijian women why they avoided certain potentially toxic fish during pregnancy and breastfeeding, he found that many didn’t know or had fanciful reasons. Regardless of their personal understanding, by mimicking this culturally adaptive behavior they were protecting their offspring. The unique trick of human psychology, these researchers suggest, might be this: our big brains are evolved to let local culture lead us in life’s dance.

The applications of this new way of looking at the human mind are still in the offing. Henrich suggests that his research about fairness might first be applied to anyone working in international relations or development. People are not “plug and play,” as he puts it, and you cannot expect to drop a Western court system or form of government into another culture and expect it to work as it does back home. Those trying to use economic incentives to encourage sustainable land use will similarly need to understand local notions of fairness to have any chance of influencing behavior in predictable ways.

Because of our peculiarly Western way of thinking of ourselves as independent of others, this idea of the culturally shaped mind doesn’t go down very easily. Perhaps the richest and most established vein of cultural psychology—that which compares Western and Eastern concepts of the self—goes to the heart of this problem. Heine has spent much of his career following the lead of a seminal paper published in 1991 by Hazel Rose Markus, of Stanford University, and Shinobu Kitayama, who is now at the University of Michigan. Markus and Kitayama suggested that different cultures foster strikingly different views of the self, particularly along one axis: some cultures regard the self as independent from others; others see the self as interdependent. The interdependent self—which is more the norm in East Asian countries, including Japan and China—connects itself with others in a social group and favors social harmony over self-expression. The independent self—which is most prominent in America—focuses on individual attributes and preferences and thinks of the self as existing apart from the group.

The classic “rod and frame” task: Is the line in the center vertical?

That we in the West develop brains that are wired to see ourselves as separate from others may also be connected to differences in how we reason, Heine argues. Unlike the vast majority of the world, Westerners (and Americans in particular) tend to reason analytically as opposed to holistically. That is, the American mind strives to figure out the world by taking it apart and examining its pieces. Show a Japanese person and an American the same cartoon of an aquarium, and the American will remember details mostly about the moving fish while the Japanese observer will likely later be able to describe the seaweed, the bubbles, and other objects in the background. Shown another way: in a different test, analytic Americans will do better on something called the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed. Americans see the line as apart from the frame, just as they see themselves as apart from the group.

Heine and others suggest that such differences may be the echoes of cultural activities and trends going back thousands of years. Whether you think of yourself as interdependent or independent may depend on whether your distant ancestors farmed rice (which required a great deal of shared labor and group cooperation) or herded animals (which rewarded individualism and aggression). Heine points to Nisbett at Michigan, who has argued that the analytic/holistic dichotomy in reasoning styles can be clearly seen, respectively, in Greek and Chinese philosophical writing dating back 2,500 years. These psychological trends and tendencies may echo down generations, hundreds of years after the activity or situation that brought them into existence has disappeared or fundamentally changed.

And here is the rub: the culturally shaped analytic/individualistic mind-sets may partly explain why Western researchers have so dramatically failed to take into account the interplay between culture and cognition. In the end, the goal of boiling down human psychology to hardwiring is not surprising given the type of mind that has been designing the studies. Taking an object (in this case the human mind) out of its context is, after all, what distinguishes the analytic reasoning style prevalent in the West. Similarly, we may have underestimated the impact of culture because the very ideas of being subject to the will of larger historical currents and of unconsciously mimicking the cognition of those around us challenges our Western conception of the self as independent and self-determined. The historical missteps of Western researchers, in other words, have been the predictable consequences of the WEIRD mind doing the thinking.

Can we bring the Greek Gods back, please?
https://www.opensourcereligion.com/2013/02/24/can-we-bring-the-greek-gods-back-please/
Sun, 24 Feb 2013 04:14:19 +0000

Has anyone else noticed modern organized religion is kind of a bummer? Even if your divine belief system isn’t violently persecuting another, it seems like you’re still trapped in a church singing dirges all Sunday. Modern religion doesn’t have any flair. This is why I’d like to offer a modest proposal: Let’s bring back the ancient Greek gods. Yes, I mean Zeus, Hera, Apollo, Aphrodite, Ares, the whole shebang — and here’s why I think they’d make a significant improvement over our current options.

They’re relatable. The Greek gods are definitely gods, but they’re also still recognizably human. They have the same emotions, problems and insecurities as regular humans do, and thus, they’re far more understandable than nebulous clouds or old bearded men on thrones. The Greek gods actually know what people go through in their lives, because they experience the same feelings. This may make the Greek gods fallible, but it also makes them far more relatable than other divine beings.

They have variety. If you’re part of a monotheistic religion, your god is kind of your one-stop shopping for divinity. You’re stuck with them, no matter what happens in your life, no matter what your current needs are. But there are tons of Greek gods! Don’t think Apollo is getting the job done? Switch to Hephaestus. Have a specific home-related issue coming up? Then pray to Hestia, goddess of the hearth. While a monotheistic god pretty much handles everything for his followers, the Greek gods know how to delegate, giving followers options based on need, preference and situation.

They’re easily adaptable to modern life. Most major religions haven’t had a serious update for at least a millennium or so. As such, it can be hard to truly integrate these religions into modern times. But thanks to their diversity, the Greek gods would snap right into place. Hermes is obviously the god of cellphones, emails and text messages. As a craftsman, Hephaestus would probably handle all computers and network issues, while Demeter would watch over restaurants. Apollo, the god of YouTube videos. You can’t tell me that life wouldn’t be at least a little bit easier if we had a god specifically handling YouTube videos.

They are extremely open-minded. Greek gods do not care what you are. They don’t care about your gender, the color of your skin, or your sexual preference. They have never told anyone to start a war (except Ares, the god of war, and even then it was just to have a war, not to persecute other groups). In fact, it was generally the ancient Greeks who started their own wars and then asked the gods for help, at which point the gods would pick sides. All I’m saying is that the Greek gods never inspired any holy wars, never gave anyone shit for not believing in them, and never demanded their followers proselytize. The Greek gods only cared about themselves, and the side benefit of that self-centeredness was a refreshing lack of prejudice.

You know where you stand with a Greek god. The Greek gods are like hormonal teenagers. Their emotions run high, and can change at the drop of a hat. They’re easily angered and easily enamored, but they can be managed. You know they’re going to be volatile, so you can deal with that — and if they happen to be nice and kind to you, hey, bonus. The Greek gods didn’t suddenly change from a brimstone-and-fire-worship-me-or-I’ll-smite-you Old Testament-type thing to a love-everybody-hippie-dippie New Testament-thing, completely contradicting themselves. They’ve always been self-centered jerks, making them consistent, if nothing else.

Greek gods will have sex with you. That’s pretty awesome. Just knowing you have a chance to score with a god or goddess adds a certain zest to life. Now admittedly, sometimes the Greek gods got a little… er, rape-y, and that’s not cool. On the other hand, Law & Order: SVU would become super exciting.

They make at least as much sense as the other guys. One of the biggest problems with the Judeo-Christian God that Christian scholars have tried to rationalize over the centuries is how a good and loving god could allow evil to exist; while they’ve come up with plenty of answers, none of them are particularly satisfying. This isn’t an issue for the Greek gods, because they aren’t pretending to be omnipotent and loving. Like humans, they can be good and evil themselves. You don’t have to wonder why the Greek gods let bad things happen to good people, because the Greek gods can simply be assholes. They care about you as long as you’re caring/genuflecting/sacrificing bulls to them. Tit for tat. Honestly, just take a look around. Does it seem like the universe is currently being run by one omniscient guy who completely loves everybody or by a bunch of over-emotional, self-centered jerks? I rest my case.

They’re so much more fun. Here’s a short list of things we could do if we brought back the Greek gods:
• Go to oracles.
• Go on quests.
• Fight monsters.
• Challenge gods to contests.
• Go to Hades and try to rescue dead loved ones.
• Dip babies in magic rivers, making them invulnerable.
Now, not all of those are good ideas — most of them are insanely dangerous — but man, they’re still a hell of a lot more exciting than sitting in church for an hour every Sunday.
See on io9.com

From Malcolm Gladwell to the Freakonomics guys to (discredited) science writer Jonah Lehrer, writers these past few years have flooded bookstores with popular nonfiction titles that purport to tell us how we think. But something has been lost amid the recent vogue for cognitive science and behavioral economics. What about the human part of human behavior — the dreams and desires that set us apart from animals and computers? Are we just assemblages of neurons and chemicals?

Adam Phillips, a prolific British writer and psychoanalyst, is one of the few prominent voices in the social sciences who defends a more abstract, mysterious human mind against the certainties of biology and cognitive science. In previous books he has wrestled with the virtues of monogamy, with the meaning of promises and with the fate of kindness in the modern world.

In Missing Out, his slightly messy but deeply humane new book, he turns to dissatisfaction: How do we cope with not having what we desire, or not being who we want to be? Although dissatisfaction may cause us pain, Phillips concedes, we shouldn’t think of it as a weakness to be overcome. It’s a natural part of human existence, and one that can ultimately provide us pleasure if we let it. “We may need to think of ourselves as always living a double life,” Phillips writes, “the one we wish for and the one that we practice; the one that never happens and the one that keeps happening.”

In a series of five essays, on themes ranging from the uses of frustration to the pleasures of misunderstanding, he shows us how to negotiate the tension between who we are and what we crave. (A sixth essay, on madness in the theater, is listed as an “appendix” but has almost nothing to do with the remainder of the book; perhaps it’s here to pad out a publication of under 200 pages.) For citizens of the U.S. and other rich nations, “affluence has allowed more people than ever before to think of their lives in terms of choices and options.” And modern media have rendered these other possible lives tantalizingly visible. That should and can be a source of pleasure, but it also wears us down.

Our desires exceed our abilities to satisfy them. That’s just life. Yet too often we berate ourselves for not fulfilling our fantasies. Phillips argues that, contrary to the old Freudian tradition that adults have to moderate their desires and come to terms with disappointment, we should embrace frustration as a window into our true selves. Of course some frustrations are unhealthy. (I’m never going to be a tennis pro or a ballet dancer, and it’s high time I accept it.) Just as often, though, dissatisfaction teaches us what we really want. We dream of a perfect meal or perfect vacation or perfect lover, only to find that the real world disappoints. But disappointment allows us to learn from experience, to think about our desires, and eventually to find satisfaction in something between what we desire and what we get. “We prefer our satisfactions without their requisite frustrations,” he writes, but “it is only from our sense of frustration that we get a clue about the possibilities of satisfaction.”

Although Phillips is a practicing child psychologist, he doesn’t rely on case studies to make his arguments. In most of Missing Out, he sounds much more like a literary critic than a shrink, drawing examples from novelists like Graham Greene, from poets like John Ashbery or Philip Larkin and, in particular, from Shakespeare. He spends pages dissecting King Lear and even more time on Othello, a play that’s all about how people react when they want something they can’t have. Othello himself, with his self-destructive need for certainty about his wife’s fidelity, is Phillips’ ultimate tragic hero. (For a psychoanalyst, Phillips’ bardolatry is understandable; according to the literary critic Harold Bloom, whom Phillips cites in this book, Freud was just “a prose version of Shakespeare.”)

For readers of a more scientific bent, Phillips’ frequent appeals to literature may seem out of place. His style, too — full of head-scratching paradoxes and qualified propositions (couched in phrases like “we might wonder” or “we might even say”) — can sometimes be so fluid that it’s difficult to pin down exactly what the writer believes. Yet Missing Out isn’t supposed to be a scientific treatise on the architecture of the brain. It’s a meditation on who we are that forgoes easy answers in favor of better questions. Because Phillips believes that, for imperfect, desiring creatures like us, the easy answers may be the most harmful ones. To avoid slipping into anger or revenge, he concludes, “We need … to have better — more interesting, more enlivening, more satisfying — conversations about our frustrations.” His book is a very good place to start.
See on www.npr.org

As we head into a new year, the guardians of traditional religion are ramping up efforts to keep their flocks—or, in crass economic terms, to retain market share. Some Christians have turned to soul searching while others have turned to marketing. Last fall, the LDS church spent millions on billboards, bus banners, and Facebook ads touting “I’m a Mormon.” In Canada, the Catholic Church has launched a “Come Home” marketing campaign. The Southern Baptist Convention voted to rebrand itself. A hipster mega-church in Seattle combines smart advertising with sales force training for members and a strategy the Catholics have emphasized for centuries: competitive breeding.

In October of 2012 the Pew Research Center announced that for the first time ever Protestant Christians had fallen below 50 percent of the American population. Atheists cheered and evangelicals beat their breasts and lamented the end of the world as we know it. Historian of religion Molly Worthen has since offered big-picture insights that may dampen the most extreme hopes and fears. Anthropologist Jennifer James, on the other hand, has called fundamentalism the “death rattle” of the Abrahamic traditions.

In all of the frenzy, few seem to give any recognition to the player that I see as the primary hero, or, if you prefer, culprit—and I’m not talking about science popularizer and atheist superstar Neil deGrasse Tyson. Then again, maybe I am talking about Tyson in a sense, because in his various viral guises—as a talk show host and tweeter and as the face on scores of smartass Facebook memes—Tyson is an incarnation of the biggest threat that organized religion has ever faced: the internet.

A traditional religion, one built on “right belief,” requires a closed information system. That is why the Catholic Church put an official seal of approval on some ancient texts and banned or burned others. It is why some Bible-believing Christians are forbidden to marry nonbelievers. It is why Quiverfull moms home school their kids from carefully screened text books. It is why, when you get sucked into conversations with your fundamentalist uncle George from Florida, you sometimes wonder if he has some superpower that allows him to magically close down all avenues into his mind. (He does!)

Religions have spent eons honing defenses that keep outside information away from insiders. The innermost ring wall is a set of certainties and associated emotions like anxiety and disgust and righteous indignation that block curiosity. The outer wall is a set of behaviors aimed at insulating believers from contradictory evidence and from heretics who are potential transmitters of dangerous ideas. These behaviors range from memorizing sacred texts to wearing distinctive undergarments to killing infidels. Such defenses worked beautifully during humanity’s infancy. But they weren’t really designed for the current information age.

Tech-savvy mega-churches may have twitter missionaries, and Calvinist cuties may make viral videos about how Jesus worship isn’t a religion, it’s a relationship, but that doesn’t change the facts: the free flow of information is really, really bad for the product they are selling. Here are five kinds of web content that are like, well, like electrolysis on religion’s hairy toes.

Radically cool science videos and articles. Religion evokes some of our most deeply satisfying emotions: joy, for example, and transcendence, and wonder. This is what Einstein was talking about when he said that “science without religion is lame.” If scientific inquiry doesn’t fill us at times with delight and even speechless awe at new discoveries or the mysteries that remain, then we are missing out on the richest part of the experience. Fortunately, science can provide all of the above, and certain masters of the trade and sectors of the internet are remarkably effective at evoking the wonder—the spirituality if you will—of the natural world unveiled. Some of my own favorites include Symphony of science, NOVA, TED, RSA Animate, and Birdnote.

It should be no surprise that so many fundamentalists are determined to take down the whole scientific endeavor. They see in science not only a critic of their outdated theories but a competitor for their very best product, a sense of transcendent exuberance. For millennia, each religion has made an exclusive claim, that it alone had the power to draw people into a grand vision worth a lifetime of devotion. Each offered the assurance that our brief lives matter and that, in some small way, we might live on. Now we are getting glimpses of a reality so beautiful and so intricate that it offers some of the same promise. Where will the old tribal religions be if, in the words of Tracy Chapman, we all decide that Heaven’s here on earth?

Curated Collections of Ridiculous Beliefs. Religious beliefs that aren’t yours often sound silly, and the later in life you encounter them the more laughable they are likely to sound. Web writers are after eyeballs, which means that if there’s something ridiculous to showcase then someone is guaranteed to write about it. It may be a nuanced exposé or a snarky list or a flaming meme, but the point, invariably, is to call attention to the stuff that makes you roll your eyes, shake your head in disbelief, laugh, and then hit Share.

The Kinky, Exploitative, Oppressive, Opportunistic and Violent Sides of Religion. Of course, the case against religion doesn’t stop at weird and wacky. It gets nasty, sometimes in ways that are titillating and sometimes in ways that are simply dark. The Bible is full of sex slavery, polygamy and incest, and these are catalogued at places like Evilbible.com. Alternately, a student writing about holidays can find a proclamation in which Puritans give thanks to God for the burning of Indian villages or an interview on the mythic origins of the Christmas story. And if the Catholic “Come Home” plea sounds a little desperate, it may well be because the sins of the bishops are getting hard to cover up. On the net, whatever the story may be, someone will be more than willing to expose it.

Supportive communities for people coming out of religion. With or without the net (but especially with it) believers sometimes find their worldview in pieces. Before the internet existed, most people who lost their faith kept their doubts to themselves. There was no way to figure out who else might be thinking forbidden thoughts. In some sects, a doubting member may be shunned, excommunicated, or “disfellowshipped” to ensure that doubts don’t spread. So, doubters used to keep silent and then disappear into the surrounding culture. Now they can create websites, and today there are as many communities of former believers as there are kinds of belief. These communities range from therapeutic to political, and they cover the range of sects: Evangelical, Mormon, Jehovah’s Witness, and Muslim. There’s even a web home for recovering clergy. Heaven help the unsuspecting believer who wanders into one of these sites and tries to tell members in recovery that they’re all bound for hell.

Lifestyles of the fine and faithless. When they emerge from the recovery process, former Christians and Muslims and whatnot find that there’s a whole secular world waiting for them on the web. This can be a lifesaver, literally, for folks who are trapped in closed religious communities on the outside. On the web, they can explore lifestyles in which people stay surprisingly decent and kind without a sacred text or authority figures telling them what to do. In actuality, since so much of religion is about social support (and social control), lots of people skip the intellectual arguments and exposés, and go straight to building a new identity based in a new social network. Some web resources are specifically aimed at creating alternatives to theism, for example, Good without God, Parenting Beyond Belief, or The Foundation Beyond Belief.

Interspiritual Okayness. This might sound odd, but one of the threats to traditional religion is interfaith communities that focus on shared spiritual values. Many religions make exclusive truth claims and see other religions as competitors. Without such claims, there is no need for evangelism, missionaries or a set of doctrines that I call donkey motivators (i.e., carrots and sticks) like heaven and hell. The web showcases the fact that humanity’s bad and good qualities are universal, spread across cultures and regions, across both secular and religious wisdom traditions. It offers reassurance that we won’t lose the moral or spiritual dimension of life if we outgrow religion, while at the same time providing the means to glean what is truly timeless and wise from old traditions. In doing so, it inevitably reveals the limitations of any single tradition alone. The Dalai Lama, who has led interspiritual dialogue for many years, made waves recently by saying as much: “All the world’s major religions, with their emphasis on love, compassion, patience, tolerance, and forgiveness can and do promote inner values. But the reality of the world today is that grounding ethics in religion is no longer adequate. This is why I am increasingly convinced that the time has come to find a way of thinking about spirituality and ethics beyond religion altogether.”

The power of interspiritual dialogue is analogous to the broader power of the web in that, at its heart, it is about people finding common ground, exchanging information, and breaking through walls to find a bigger community waiting outside. Last year, Jim Gilliam, founder of Nationbuilder, gave a talk titled “The Internet is My Religion.” Gilliam is a former fundamentalist who has survived two bouts of cancer thanks to the power of science and the internet. His existence today has required a bone marrow transplant and a double lung transplant, organized in part through social media. Looking back on the experience, he speaks with the same passion that drove him when he was on fire for Jesus:

I owed every moment of my life to countless people I would never meet. Tomorrow, that interconnectedness would be represented in my own physical body. Three different DNAs. Individually they were useless, but together they would equal one functioning human. What an incredible debt to repay. I didn’t even know where to start. And that’s when I truly found God. God is just what happens when humanity is connected. Humanity connected is God.

The Vatican, and the Mormon Quorum of the Twelve Apostles, and the Southern Baptist Convention should be very worried.

LOS ALTOS, CA, January 10, 2013 – A study published in the July 2012 issue of Explore provides further evidence that the prayers of one individual used to treat a physical condition of another may – or may not – help.

How’s that for definitive?

Depending on the methods used, the condition being treated, and the individuals involved, “results may vary” for those scientists trying to figure out if prayer – the most commonly used form of complementary and alternative medicine (CAM), according to one government survey – really works.

In this particular study, a team of researchers from the Institute of Noetic Sciences (IONS) and the University of California, San Francisco (UCSF) set out to determine if distant healing intention (DHI) is effective in treating surgical wounds. DHI is defined by the study’s authors as “a compassionate mental act intended to improve the health and well-being of another person at a distance.” Some of the terms used to describe DHI are intercessory prayer, spiritual healing, intentionality, energy healing, shamanic healing, nonlocal healing, noncontact therapeutic touch, and Reiki.

Seventy-two women undergoing elective surgery – some for reconstruction after breast cancer surgery and some for cosmetic reasons – were divided into three groups: a blinded group receiving DHI, a blinded group not receiving DHI, and an unblinded group receiving DHI. This configuration allowed the researchers to study the effect of their subjects’ expectations by varying the degree to which they knew they were being prayed for.

Here’s what they found:

“The more that participants believed in distant healing, and the more they thought that distant healing was actually focused on them, the worse they did on both objective and subjective measures. In addition, the better the healers thought that they were doing, again, the worse the participants’ outcomes.”

At first blush this would appear to validate the arguments of those who say that prayer is nothing more than wishful thinking – a generally harmless but occasionally dangerous practice. However, the researchers weren’t so quick to come to that same conclusion. While admitting the possibility that DHI effects do not exist, they also considered the possibility that DHI effects do exist but that “the relevant variables that modulate these effects are not well understood and interact in complex ways.”

An increasing amount of evidence indicates that one of the most important variables to consider is the thought of the individual being treated. Qualities of thought such as self-doubt, anger, and fear have long been known to have a negative impact on health, whereas qualities such as forgiveness, gratitude, and compassion – even a belief in a divine power – can have a positive effect.

Then there’s the thought of the one providing treatment.

In a study that mirrors somewhat the work done by IONS/UCSF, a team of researchers led by Harvard Medical School’s Dr. Herbert Benson concluded that intercessory prayer had an adverse effect on patients recovering from coronary bypass surgery who were aware that someone was praying for them. However, what most news organizations reporting on these findings failed to mention was that those who were doing the praying belonged to a religious group that, according to Indiana University religion professor Candy Gunther Brown, “have long denied that prayer works ‘miracles,’ and have even called petitionary prayer ‘useless.’”

The question is, will we ever be able to determine if prayer, for another or for one’s self, can have a positive impact on health?

As the IONS/UCSF study suggests, it will likely require further study – even the development of new theories or an entirely different methodology – before we reach anything approaching a definitive answer. In the meantime, those who have found prayer to be an effective means of treating the body (e.g.), even in lieu of conventional medicine, will likely continue praying. And some day – some day – we just might have a better understanding of the source of their confidence and success.

Eric Nelson is a Christian Science practitioner, whose articles on the link between consciousness and health appear regularly in a number of local, regional, and national online publications. He also serves as the media and legislative spokesperson for Christian Science in Northern California.