When my brother David was three, he learned how to pee standing up. He was so excited that the next time he saw his sister Kathy, age twenty, heading for the bathroom, he ran after her to share his newfound knowledge. But for some reason unknown to him, he failed to convince her that she could do it too. After trying several times to explain, he got so frustrated that he burst into tears. Sharing our thoughts, ideas, convictions and discoveries—especially counterintuitive or secret information—feels good to us, really good. Our desire to discover and share information literally defines the kind of creatures we are.

Information gets passed through hierarchical social networks. Some scientists describe our species as “social information specialists.” That is our ecological niche. In other words, we are social animals, and the way that we survive and thrive is by gathering information, which we mostly get from other human beings. We transfer information vertically from generation to generation and horizontally between members of a generation.

This means that our sense of reality is socially constructed. It is largely a product of what other people think and the information they choose to share with us.

We also are a hierarchical species, which is relevant to information flow in a couple of ways: people gain status by having valuable information to share, and we look to those who have more power and status than us—in other words authority figures—to provide us with particularly valuable information.

Truth is a valuable commodity. Human beings, for the most part, are deeply invested in knowing what is real, because accurate information has survival value. Consequently, we have evolved all sorts of mechanisms for discerning error and deception. For example, we read eye contact and body language, and we often have an intuitive suspicion of things that sound “too good to be true.”

Once we get fooled by someone or trust their well-meaning but misguided advice, we are wary thereafter; and rebuilding trust can be challenging. Fool me once, shame on you; fool me twice, shame on me. Across both secular and religious wisdom traditions, lists of virtues include words like honesty, integrity, skepticism, veracity, truth-seeking and trustworthiness. Religions (and political parties) proclaim themselves sole purveyors of truth, often written with a capital “T.”

This is not to say that honesty always rules. People deceive each other for countless reasons, and some take particular pleasure in getting away with it—magicians, for example, or on a darker note, sociopaths. But even among people who are perfectly ordinary and morally intact, white lies are ubiquitous and black lies common. (Evolution likely primed us for some optimal balance between being deceitful and being perceived as trustworthy.)

We don’t just trick others, either. Healthy adults indulge in some virtually universal and probably adaptive patterns of self-deceit—thinking, for example, that we are more competent and better liked than is actually the case. We regularly fall into biases called confirmatory thinking and motivated reasoning, and, in fact, knowledge exploded only after we figured out a system for erecting barriers against these biases. We call that system the scientific method, and it is basically a procedure for forcing ourselves to ask the questions that could prove us wrong.

But regardless of humanity’s penchant for fudging the facts, being able to figure out what’s real has tremendous survival value, and so we care about it a lot. We hate being in the dark and love being in the know. And when we think we have learned something valuable, we want to share that information with other people—which is why, right now, I am sitting at my computer writing this article rather than painting the walls of my basement bedroom.

This whole system of trading information exists to help us survive and thrive, largely by making it possible for us to figure out the cause-and-effect relationships that govern our well-being. Picture us as little rats in a lab who have to push buttons for food, except that some of the buttons deliver shocks instead. From other rats who share our cage, we learn which buttons to push without having to try them all out. But if we imitate a rat who has the rules wrong, we might end up hungry or shocked. Consequently, our ability to discern what is real is pretty good.

But it’s far from perfect.

We share patterns of blind spots. Most people have heard that our eyes don’t bother scanning everything in our environment. Rather, we scan for certain features and then literally fill in the blank spots by hallucinating. This works better than mapping every detail because it is so much faster and more efficient, and also, our optic nerves or external objects can obstruct part of our visual field.

But our eyes and visual cortex aren’t the only parts of us that rely on shortcuts. Psychologist Daniel Kahneman won a Nobel Prize in economics for his research showing that our brains have two different processing modes, which he calls thinking fast and slow.

Thinking slow is the conscious reasoning that we assume underpins most of our decisions. In reality, much of what we believe and do gets decided by thinking fast—a kind of information processing that is quick and intuitive, largely subconscious, and prone to specific kinds of glitches. Even the most vigilant truth-seeker does the hard work of conscious thought only a minority of the time, when incoming information gets flagged for extra scrutiny—perhaps because it is counterintuitive or emotionally displeasing, or because it comes from someone we mistrust.

Our blind spots and shortcuts, some of them hard-wired and some formed of habit, create opportunities for misinformation to slip past our defenses. False ideas can feel true when they are not. They can avoid detection and correction. And when this happens, some get passed from person to person and generation to generation.

I said earlier that we sometimes like tricking other people and tricking ourselves. (In fact, research shows that the best way to trick someone else is to trick yourself.) But a majority of the bad information we share is stuff that we truly believe, and we honestly think we are doing others a favor by passing it on. Most of the harm people do in the world is done with good intentions, and passing on misinformation is no exception.

One kind of false information that gets passed on by well-meaning people is superstition. If you took an introductory psychology class at some point, you may recall B.F. Skinner’s experiments, which showed that even pigeons develop superstitious behaviors when they can’t figure out why some of their efforts pay off and some don’t. When Skinner dropped feed pellets into their box on a schedule unrelated to their behavior, three quarters of the pigeons developed peculiar rituals that they—apparently—thought helped to bring down the food.

We are vastly more intelligent than pigeons, but our lives are also vastly more complex. And one thing we do that pigeons can’t is spread superstitions by word of mouth. But we don’t just pass on anything and everything. The superstitious information we share is mostly stuff that is perfectly suited to take advantage of our blind spots and logical glitches, exploiting our dependence on the powerful subconscious processes that enable thinking fast.

It feels real. Very real.

And that, in an odd way, is by design.

———————

Part 2 in a 4-part series. Read Part 3 here. This series was abridged as a single article in The Humanist – A magazine of critical inquiry and social concern. May/June 2017 issue, p. 16-21.