How Do You Think?

The more I talk with others about the erroneous inclinations of the intuitive brain, the more I face responses that are incredulous, emotional, and sometimes irrational. When it comes to intuition, it seems, people are quite fond of theirs. Appeals to rational thought, I am often reminded, elicit annoyance. Ramp up the annoyance when you remind people of the biases that underlie their silly beliefs. Rational and scientific thinking in the public domain is no way to win friends either, and its use is not necessarily an effective way to win an argument. In a recent conversation with a colleague about the magical power of full moons, I said something silly like “the data don’t support a relationship between the phase of the moon and problem behaviors in classrooms.” The response was “I don’t believe in data.” How do you respond to that? How do you respond to the rejection of reality?

I don’t have anything against intuitive thinking. Well, that may not be completely true, as it is clearly prone to error. It is, however, the source of creativity. My wife suggests that intuition is part of the essence of being a woman: that women are socialized to value it as if it were foundational. Rejecting it is like rejecting a core piece of oneself.

I can’t imagine a world devoid of intuition. I’m not sure I want to. On the other hand, the costs of it are ever present and often very destructive. When I strive to find the balance, I struggle. Perhaps you can help me find that balance, or perhaps bolster the value of this sticky propensity. Please tell me what you think.

Have you ever seen familiar and improbable shapes in those puffy white cumulus clouds as they pass overhead? Notice the squirrel or dinosaur in the image to the right. Some of you may have seen the recent American Express commercial that portrays items positioned in such a way that we perceive them as sad or happy faces (much like the bathtub fixture below). Now notice the “Hand of God” in the NASA image below and to the right, taken by the Chandra X-ray Observatory. This picture shows energized particles streaming from a pulsar in a field of debris from a massive supernova. Many of us instinctively see in this image what looks like the wrist and hand of a person (or of God, as the name of this nebula implies). Speaking of God, on the internet there are many more explicit examples of religious imagery in far more benign items such as tree trunks, clouds, pancakes, and tortillas. This tendency is not limited to the visual sense. We make the same type of errors with auditory information (as is evident in backmasking in popular music). These tendencies, which are in fact illusory, are a consequence of our neural circuitry.

Our brains do not tolerate vague or ambiguous stimuli very well. We have an innate tendency to perceive clear and distinct images within such amorphous stimuli. This tendency is called pareidolia. It is also referred to as patternicity. This tendency is so ubiquitous that a projective personality test (the Rorschach Inkblot Test) relies on and “interprets” this inclination.*

It has been suggested that those of our ancestors who assigned agency to things that went bump in the night (perceiving vague data as a threat) responded in a way that facilitated survival. Those who ignored the stimuli were more likely to be predated and thus not pass on their genes. Carl Sagan noted in his classic book, The Demon-Haunted World, that this tendency is likely linked to other aspects of individual survival. He wrote:

“As soon as the infant can see, it recognizes faces, and we now know that this skill is hardwired in our brains. Those infants who a million years ago were unable to recognize a face smiled back less, were less likely to win the hearts of their parents, and less likely to prosper. These days, nearly every infant is quick to identify a human face, and to respond with a goony grin.

As an inadvertent side effect, the pattern recognition machinery in our brains is so efficient in extracting a face from a clutter of other detail that we sometimes see faces where there are none. We assemble disconnected patches of light and dark and unconsciously see a face. The Man in the Moon is one result” (Sagan 1995: 45).

Michael Shermer wrote of patternicity in the December 2008 issue of Scientific American. In that article Shermer noted that scientists have historically treated patternicity as an error in cognition – specifically, a type I error, or false positive. A false positive in this context is believing that something is real when, in fact, it is not. Shermer discussed a paper in the Proceedings of the Royal Society entitled “The Evolution of Superstitious and Superstition-like Behaviour” by biologists Kevin R. Foster (Harvard University) and Hanna Kokko (University of Helsinki), who used evolutionary modeling to test the hypothesis that patternicity enhances survivability. Shermer wrote: “They demonstrated that whenever the cost of believing a false pattern is real is less than the cost of not believing a real pattern, natural selection will favor patternicity.” Of the implications, Shermer wrote: “…believing that the rustle in the grass is a dangerous predator when it is only the wind does not cost much, but believing that a dangerous predator is the wind may cost an animal its life.”
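The cost asymmetry Shermer describes can be illustrated with a toy simulation. To be clear, this is only a sketch of the logic, with invented costs and probabilities; it is not Foster and Kokko’s actual evolutionary model.

```python
import random

random.seed(42)

def lifetime_cost(believe_all, p_predator=0.05, trials=10_000,
                  cost_false_alarm=1, cost_missed_predator=100):
    """Total cost paid over many 'rustle in the grass' events.

    A 'believer' always flees, paying a small cost whenever the rustle
    was only the wind. A 'skeptic' never flees, paying a large cost
    whenever the predator was real.
    """
    total = 0
    for _ in range(trials):
        predator = random.random() < p_predator
        if believe_all and not predator:
            total += cost_false_alarm       # fled from the wind
        elif not believe_all and predator:
            total += cost_missed_predator   # ignored a real threat
    return total

believer = lifetime_cost(believe_all=True)
skeptic = lifetime_cost(believe_all=False)
print(believer < skeptic)  # True with these numbers: patternicity pays
```

With these invented numbers the believer’s expected cost per event is 0.95 × 1 while the skeptic’s is 0.05 × 100, so the pattern-believer comes out ahead; flip the cost ratio and the skeptic wins, which is exactly the inequality in Shermer’s summary.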

It is a double-edged sword, it seems. Not only has this tendency entertained us and likely facilitated our very survival as a species, but it may in fact serve as the basis of our individual inclinations toward superstitious thinking. Shermer wrote:

“Through a series of complex formulas that include additional stimuli (wind in the trees) and prior events (past experience with predators and wind), the authors conclude that “the inability of individuals—human or otherwise—to assign causal probabilities to all sets of events that occur around them will often force them to lump causal associations with non-causal ones. From here, the evolutionary rationale for superstition is clear: natural selection will favour strategies that make many incorrect causal associations in order to establish those that are essential for survival and reproduction.”

Yet again this is an example of how our intuitive brain can lead us astray!

* The Rorschach inkblot test, along with most projective measures in the field of psychology, have fallen out of favor due to poor reliability and validity.

Two years ago Steven Pinker wrote an intriguing piece in the New York Times entitled “The Moral Instinct.” Dr. Pinker is a Harvard College Professor and Johnstone Family Professor in the Department of Psychology at Harvard University who conducts research on language and cognition. This article in many ways stirred me and led to a paradigm shift in my thinking about morality. I am a cognitive behavioral psychologist, and my training regarding moral development treated morality as a rationally driven developmental process (Piaget & Kohlberg). In other words, it was believed that morality developed as one’s cognitive capacity to think advanced. The article also helped me get more comfortable with letting go of the notion that religion is the sole driver of morality in society.

Pinker’s article is a long one and I cannot do it justice here, but I want to share some of his major arguments.

Morality is a complex concept shaped by evolution, neurobiology, and culture. Pinker states that “Moral goodness is what gives each of us the sense that we are worthy human beings. We seek it in our friends and mates, nurture it in our children, advance it in our politics and justify it with our religions. A disrespect for morality is blamed for everyday sins and history’s worst atrocities. To carry this weight, the concept of morality would have to be bigger than any of us and outside all of us.” Looking at morality from a scientific perspective causes concern in those who hold the view that it is sacred and the unique domain of religion. Regardless, Pinker urges us to step back and look at it in a systematic way. Much research has been conducted on the concept and he touches on the most important findings that have shaped the modern understanding of this topic.

Moral judgment, it seems, is a “switch” on a continuum of valuations we make about others’ or our own behavior. We may judge a behavior as imprudent, unfashionable, disagreeable, or perhaps immoral. The switching point on that continuum, where judgments are made that deem a behavior immoral, is in some cases universal (e.g., rape and murder); however, the line is not so clear for other acts. For example, there are individuals today who may flip the switch of immoral judgment when looking at someone eating meat (e.g., an ethical vegetarian), using paper towels, shopping at Walmart, or even smoking. The zeitgeist (the accepted standard of conduct and morality) certainly does shift over time. Pinker notes “…. many behaviors have been amoralized, switched from moral failings to lifestyle choices. They include divorce, illegitimacy, being a working mother, marijuana use and homosexuality. Many afflictions have been reassigned from payback for bad choices to unlucky misfortunes.” And he adds “This wave of amoralization has led the cultural right to lament that morality itself is under assault, as we see in the group that anointed itself the Moral Majority. In fact there seems to be a Law of Conservation of Moralization, so that as old behaviors are taken out of the moralized column, new ones are added to it. Dozens of things that past generations treated as practical matters are now ethical battlegrounds, including disposable diapers, I.Q. tests, poultry farms, Barbie dolls….. Food alone has become a minefield, with critics sermonizing about the size of sodas, the chemistry of fat, the freedom of chickens, the price of coffee beans, the species of fish and now the distance the food has traveled from farm to plate.”

The roots of these moralizations are not rational, he argues. When people are pressed for the reasons why they find a particular behavior morally repugnant, they struggle. Pinker discusses Jonathan Haidt’s research, which suggests that people do not engage in moral reasoning; rather, they engage in moral rationalization. According to Pinker, Haidt contends that “they begin with the conclusion, coughed up by an unconscious emotion, and then work backward to a plausible justification.” Again, when pressed for justification for their judgment of certain behaviors as immoral, “many people admit, “I don’t know, I can’t explain it, I just know it’s wrong.”

So, morality may not be a cognitive developmental progression. Well, alright then, but where does it come from? Research is building a case for a genetic contribution, suggesting that morality may very well be instinctual. Pinker contends: “According to Noam Chomsky, we are born with a “universal grammar” that forces us to analyze speech in terms of its grammatical structure, with no conscious awareness of the rules in play. By analogy, we are born with a universal moral grammar that forces us to analyze human action in terms of its moral structure, with just as little awareness.” If this is the case, then a moral sense should be universal, and in fact there appear to be five universal morals that transcend all cultures. Again reflecting Haidt’s research, Pinker lists “… harm, fairness, community (or group loyalty), authority and purity — and suggests that they are the primary colors of our moral sense. Not only do they keep reappearing in cross-cultural surveys, but each one tugs on the moral intuitions of people in our own culture.”

If we accept that morals are in fact universal and instinctual, then how do we come to terms with the blatant discrepancies seen across cultures? Pinker contends that culture itself is the culprit. How the five spheres are ranked in terms of importance, in and across cultures, accounts for these differences. Pinker notes:

“Many of the flabbergasting practices in faraway places become more intelligible when you recognize that the same moralizing impulse that Western elites channel toward violations of harm and fairness (our moral obsessions) is channeled elsewhere to violations in the other spheres. Think of the Japanese fear of nonconformity (community), the holy ablutions and dietary restrictions of Hindus and Orthodox Jews (purity), the outrage at insulting the Prophet among Muslims (authority). In the West, we believe that in business and government, fairness should trump community and try to root out nepotism and cronyism. In other parts of the world this is incomprehensible — what heartless creep would favor a perfect stranger over his own brother? “

The cultural divide that exists today in the United States makes sense when we look at it from this perspective. Pinker writes:

“The ranking and placement of moral spheres also divides the cultures of liberals and conservatives in the United States. Many bones of contention, like homosexuality, atheism and one-parent families from the right, or racial imbalances, sweatshops and executive pay from the left, reflect different weightings of the spheres. In a large Web survey, Haidt found that liberals put a lopsided moral weight on harm and fairness while playing down group loyalty, authority and purity. Conservatives instead place a moderately high weight on all five. It’s not surprising that each side thinks it is driven by lofty ethical values and that the other side is base and unprincipled.”

Pinker delves into the neurological factors associated with morality and the evolutionary evidence and arguments for an instinctual morality. He reviews several important studies that provide evidence for these hypotheses. But, he argues that morality is more than an inheritance – it is larger than that. It is contextually driven. He notes: “At the very least, the science tells us that even when our adversaries’ agenda is most baffling, they may not be amoral psychopaths but in the throes of a moral mind-set that appears to them to be every bit as mandatory and universal as ours does to us. Of course, some adversaries really are psychopaths, and others are so poisoned by a punitive moralization that they are beyond the pale of reason.” He further contends “But in any conflict in which a meeting of the minds is not completely hopeless, a recognition that the other guy is acting from moral rather than venal reasons can be a first patch of common ground. One side can acknowledge the other’s concern for community or stability or fairness or dignity, even while arguing that some other value should trump it in that instance.”

Pinker closes with:

“Our habit of moralizing problems, merging them with intuitions of purity and contamination, and resting content when we feel the right feelings, can get in the way of doing the right thing. Far from debunking morality, then, the science of the moral sense can advance it, by allowing us to see through the illusions that evolution and culture have saddled us with and to focus on goals we can share and defend.“

Again, this comes down to getting away from intuitive thinking when it comes to important and complex issues. This not-so-simple, but very doable, step continues to stymie the best among us.

We are innately intuitive thinkers, inclined toward making all sorts of cognitive errors as we muddle through our lives. In many cases the consequences are benign enough, though I dare say that many an interpersonal conflict stems from such thinking. In some circumstances, however, the consequences can be huge. For example, when these biases are acted on by those who, from a position of power (or vulnerability), deny anthropogenic climate change, we all suffer. Other deleterious errors play out in political debates over such issues as health care reform and the privatization of Social Security, as well as in the struggles between creationists and science-minded folk over whether to teach intelligent design as part of the science curriculum.

It really doesn’t matter on which side of the issue you stand – we are all subject to errors and biases that ultimately widen the gap between the antagonists rather than bring them closer to resolution. There is little debate about the relative impact of these biases and errors as they play out in the conversations about such complicated and contentious issues. All you have to do is listen to the soundbites and spin – regardless of the side you are on, it is plainly evident that the opposing pundits and/or “experts” come from completely different realities. Sometimes it is evident that there can be no resolution because of the lack of a foundational agreement as to the terms or rules of the discussion.

My quest for some rules of thought to serve as an inoculation, of sorts, against these pervasive and seemingly instinctual erroneous inclinations has proven difficult. Instincts, it seems, are hard to resist. Virtually all of the errors I have discussed have their origins in the intuitive brain, away from the higher-order thinking areas of the cerebral cortex. Millions of years of evolution have honed these processes, conferring a survival advantage on those who attend closely to things that go bump in the night. In the arms race for survival faced by our ancestors, quick decisions were absolutely essential. Arduous skepticism was likely lethal – if not by means of predation, then certainly by means of ostracism. It takes an additional cognitive step, involving higher-order thinking, to bypass these inclinations. And as Spinoza suggested, we as a species are not inclined to take this additional step. Skepticism is difficult and perhaps even viscerally unpalatable. We must make the extra effort to employ critical thinking – particularly when the stakes are high!

It is crucially important to note that the following guidelines will only be fruitful if both sides agree to them. If not, the parties will go round and round – never really accomplishing anything.

First, we have to acknowledge the following:

A. Our default thoughts are likely intuitive thoughts and they are thus likely biased by cognitive errors. Gut-level thinking just doesn’t cut it for complex issues.

B. Things that make immediate intuitive sense are likely to be uncritically accepted. Agreeable data should not escape scrutiny.

C. Jumping to conclusions about the internal attributes of others (individuals or groups) as an explanation of behavior or circumstances is likely shortsighted. We should always seek a greater understanding of the true circumstances.

As such, we must:
1. Give equal time and scrutiny to the pursuit of disconfirming information, particularly regarding agreeable facts, because we are inclined toward finding data to support preconceptions.

2. No matter how much you like your hypothesis, always be willing to abandon it.

3. Apply these rules universally. It is imprudent to apply these guidelines only as they serve your purpose(s).

4. In order to use scientific methods to investigate any concept, the concept itself must be falsifiable.

5. Be parsimonious. The simplest among equally plausible explanations is usually the best explanation.

Some issues cannot be rationally discussed, particularly because of guidelines 2, 3, and 4. Issues that necessitate violation of these tenets are often ideologically driven and thus preclude rational or scientific debate. Some really big issues, such as the existence of God or the merits of creationism, most often cannot be reasonably debated following these guidelines – again because it is unlikely that both parties will agree to them. A big sticking point is that God’s existence, in particular, is not falsifiable. It therefore is not the domain of science to either prove or disprove God’s existence. But other big issues, such as anthropogenic global climate change or the merits of health care reform, can and should be subjected to these guidelines.

In a recent article at dbskeptic.com titled “Five Habits of the Skeptical Mind,” Nicholas Covington wisely detailed his suggestions for good skeptical hygiene. He included: (1) Your belief will not change reality; (2) Look for the best overall explanation of the facts; (3) Use authorities carefully; (4) Don’t confuse a possibility with a probability; and (5) Dissect your thoughts. R. C. Moore, in a comment on Covington’s article, added some additional strong points, including: (1) objective evidence results when all observers who follow the same protocol achieve the same results, regardless of their personal beliefs; (2) statistical error never improves with the repetition of independent samples; (3) uncalibrated experimentation is useless; and (4) while logic is very useful for modeling the behaviour of the universe, in no way does it control its behaviour. Both of these lists are helpful and wise (although I have not done them justice here). Carl Sagan’s Baloney Detection Kit is another great list.

I ask you, my readers, to add to this list. What are your rules of thought?

My previous posts addressed several common cognitive biases while briefly touching on their consequences. In review, the Fundamental Attribution Error leads us to make hasty and often erroneous conclusions about others’ personal attributes based on our superficial observations. Generally such conclusions are erroneous because we lack a sufficient understanding of the situational or external circumstances associated with the behavior in question. One particularly counterproductive manifestation of this tendency is the prejudice many individuals hold regarding the plight of the poor. The commonly held misbelief is that the poor are so because they are lazy or stupid or otherwise worthy of their circumstance. Further, the Self-Serving Bias leads the more fortunate to overattribute their own social and economic position to internal qualities. The reality is that our socioeconomic status has more to do with heritage than with personal attributes such as hard work and discipline.

Confirmation Bias, like Spinoza’s Conjecture, facilitates the internalization of information that fits our beliefs and leads us to miss, ignore, or dismiss information that challenges deeply held beliefs. We are thus likely to dismiss pertinent and valid information that might move us from those beliefs. And, perhaps most importantly, these tendencies disincline us from taking the additional steps necessary to critically scrutinize intuitively logical information. Thus we filter and screen information in a way that sustains our preconceptions – rarely truly opening our minds to alternative notions.

These biases are evident throughout society but are plain to see in those who hold strong attitudes about issues such as religion and politics. The overarching implications are that we tend to cherry pick and integrate information in order to stay in our comfortable belief paradigms. For example, some Conservatives are reassured by watching Fox News because the information aired is presorted based on the core political ideology of political conservatism. Its viewers are presented with information that avoids the unpleasantness of having to legitimately deal with divergent perspectives. Similarly, creationists ignore or negate the overwhelming evidence that substantiates the theory of evolution.

It is interesting to me that the positions held by divergent individuals, liberals or conservatives and skeptics or believers are often quite emotionally based and staunchly guarded. And rarely are “facts” universally regarded as such. We are even more likely to cling to these attitudes and values and thus be more prone to such errors in times of distress or threat. It takes careful rational discipline on both sides to constructively debate these issues.

The tendency to firmly hold onto one’s beliefs, be they religious, political, or intellectual, even in the face of compellingly disconfirming evidence, is referred to as “cognitive conservatism” (Herrnstein Smith, 2010). Between groups or individuals with divergent “belief” systems, the entrenched rarely concede points and even less frequently do they change perspectives. The polar opposites jab and attack looking for the weakest point in the argument of their nemesis. These generally fruitless exchanges include ad hominem attacks and the copious use of logical fallacies.

This is clearly evident today in debates between Republicans and Democrats as they battle over public policy. The case is the same between skeptics and believers as they pointlessly battle over the existence of God (as if existence were a provable or disprovable fact). And it is interesting that some individuals and groups selectively employ skepticism only when it serves their particular interests. This is especially evident in those who make desperate attempts to discredit the evidence for evolution while demanding that different standards be employed with regard to the question of God’s existence.

Because it seems that we as humans are hard-wired with a default for intuitive thinking we are particularly susceptible to magical, supernatural, and superstitious thinking. Compound that default with a tendency to make the above discussed cognitive errors and it is no wonder that we have pervasive and intractable political partisanship and deadly religious conflicts. Further ramifications include the widespread use of homeopathic and “alternative” medicine, the anti-vaccine movement, racism, sexism, classism, and as mentioned previously, ideologically driven denial of both evolution and anthropogenic global climate change.

It is fascinating to me that how people think and at what level they think (intuitive versus rational) plays out in such globally destructive ways. How do you think?

“The kids are crazy today – it must be a full moon.” This and other similar notions are widely held. For example, people working in Emergency Departments (EDs) assume that spikes in ED admissions are linked to the phase of the moon. Again, the thinking is that the full moon brings out the craziness in people’s behavior. Similar links are firmly believed regarding the consumption of sugar and bad behavior in children: when children eat sugar, the belief goes, it is like consuming an amphetamine – they get wild!

Such cause-and-effect notions are easily dismissed when you look closely at the laws of physics or at the biological plausibility of sugar’s effects on behavior. Further, if you actually compare the numbers of ED admissions or school behavior problems against the phases of the moon or sugar consumption, there are no relationships. PERIOD! End of story! Yet these beliefs are firmly held despite the evidence – which, admittedly, is not widely available. Why is it that we hold onto such notions?
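Checking the numbers, in this context, amounts to a very simple comparison. The sketch below uses simulated admission counts, not real hospital data; the counts are drawn from one distribution regardless of the moon, which is the null relationship the actual records show.

```python
import random
import statistics

random.seed(0)

# Simulated data: a year of daily ED admission counts drawn from the
# same distribution every day, moon or no moon.
days = 365
admissions = [random.gauss(mu=50, sigma=7) for _ in range(days)]

# A full moon roughly every 29.5 days.
full_moon_days = {round(29.5 * k) for k in range(13)}
full = [admissions[d] for d in range(days) if d in full_moon_days]
other = [admissions[d] for d in range(days) if d not in full_moon_days]

# If the myth were true, full-moon nights would stand out; here the
# difference is just sampling noise near zero.
diff = statistics.mean(full) - statistics.mean(other)
print(f"full-moon minus other nights: {diff:+.2f} admissions/day")
```

Swap in real admission counts and real full-moon dates and the computation is the same; run on real data, the comparison shows the same thing – no relationship.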

The answer is Confirmation Bias. We are inclined to take in, and accept as true, information that supports our belief systems and miss, ignore, or discount information that runs contrary to our beliefs. For example, a full moon provides a significant visual reference to which memories can be linked. And because there is a widely held mythical belief that full moons affect behavior, we also remember those confirmations more clearly. We are less likely to remember similarly bad days that lack such a strikingly visual reference point and that do not support our beliefs. As a result, we are less likely to use that data to challenge the myth.
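Selective recall alone is enough to manufacture the pattern. The toy simulation below (all probabilities invented for illustration) generates nights on which bad behavior is independent of the moon, then filters them through a biased memory that hangs on to the vivid full-moon-plus-chaos nights and forgets most of the rest:

```python
import random

random.seed(1)

# Simulated nights: 'bad' classroom behavior occurs 30% of the time,
# completely independent of whether the moon is full.
nights = []
for d in range(1000):
    full_moon = (d % 30 == 0)          # roughly one night in thirty
    bad = random.random() < 0.3
    nights.append((full_moon, bad))

def recall_chance(full_moon, bad):
    # A bad night under a striking full moon is almost always remembered;
    # every other kind of night is mostly forgotten.
    return 0.9 if (full_moon and bad) else 0.2

remembered = [n for n in nights if random.random() < recall_chance(*n)]

def bad_rate(data, moon):
    subset = [bad for full, bad in data if full == moon]
    return sum(subset) / len(subset)

# Complete record: the two rates match (no moon effect).
print(round(bad_rate(nights, True), 2), round(bad_rate(nights, False), 2))
# Remembered record: full moons now look strongly linked to bad nights.
print(round(bad_rate(remembered, True), 2), round(bad_rate(remembered, False), 2))
```

In the complete record the bad-behavior rate is near 30% with or without a full moon; in the “remembered” record, full-moon nights appear far more likely to be bad – a correlation that exists only in memory.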

This bias is not limited to full moons and sugar. It transcends rational thought and is pervasive throughout the human race. It shapes our religious and political beliefs, our parenting choices, our teaching strategies, and our romantic and social relationships. It also plays a significant role in the development of stereotypes and the maintenance of prejudices. These beliefs, good or bad, when challenged, tend to elicit emotional responses (a topic all its own). Much has been written about how and why this occurs. Other factors also play a role in this erroneous thought process (e.g., communal reinforcement, folklore, the media, attribution error, the expectancy effect, and Spinoza’s Conjecture); however, my goal is to raise your awareness of this bias, because knowing that we are prone to it may help us avoid drawing mistaken conclusions. Bottom line – it may help us open and widen our minds to different ideas and maybe even challenge some long-held mistaken beliefs.

Last week I discussed fundamental attribution error, leaving confirmation bias and Spinoza’s Conjecture to explore. Today I’m going to delve into the latter. Benedict Spinoza, a 17th-century Dutch philosopher, wrote with great insight that “mere comprehension of a statement entails the tacit acceptance of its being true, whereas disbelief requires a subsequent process of rejection.” What this suggests is that we are likely to accept, as true, a statement that makes immediate sense to us. But we can also infer from this statement that we are, in general, unlikely to critically scrutinize such intuitively sensible statements. A further implication is that we are likely to reject statements that don’t make immediate sense to us.

Sam Harris, a noted neuroscientist and author, and several colleagues at the University of California recently published a study showing that we process statements we accept as true quickly, while we process false or uncertain statements more slowly. Even more interesting, we process ambiguous or uncertain statements in regions of the brain (specifically, the left inferior frontal gyrus, anterior insula, and dorsal anterior cingulate) that are associated with processing pain and disgust. Hmmm, critical thinking hurts! This is just one example of the growing evidence that our brains work this way.

We all look at the world through our personal lenses of experience. Our experiences shape our understanding of the world, and that understanding then filters what we take in. The end result is that we may reject or ignore new and important information simply because it does not conform to our previously held beliefs. Subsequently, we may not grow or expand our understanding of the world, and we may become intellectually or professionally stagnant.

It is important to remember this tendency when we are taking in novel information. New ideas that run contrary to long-held beliefs are hard to embrace, regardless of their merit. And we are disinclined to question the legitimacy of new information, particularly if it fits our preconceptions. Challenging and/or ambiguous information, like quantum mechanics, may in some people elicit feelings similar to pain or even disgust. Perhaps this also explains the uneasy feeling many people experience when they think about such mind-blowing concepts as our size and importance relative to the vastness of time and space. The slowed, arduous, and perhaps even painful process of thinking about ambiguous or incongruous information may certainly discourage the endeavor. Perhaps the cliché “no pain, no gain” reasonably applies.

In my Cognitive Biases piece last week, I briefly introduced three common errors in thinking. In today’s post I am going to expand upon Attribution Error. Before I explain this cognitive bias, let’s look at some situations where such erroneous thinking occurs.

Where I work, at a preschool for children with substantial developmental delays, many of the children display persistently difficult behaviors. I occasionally hear comments from less seasoned staff suggesting that they believe a child’s “bad” behaviors are the result of inadequate parenting. Parents are also sometimes admonished for sending their sick child to school or for sending in an inadequate lunch.

Attribution Error occurs when we negatively judge the unfortunate actions of others as a reflection of internal attributes (such as personality traits, abilities, ethics, etc.) rather than as a result of external situational factors. In other words, we often underestimate the situational circumstances that cause a person to behave as they do and overestimate the impact of their personal attributes. This error in thinking is so ubiquitous and so easy to make that it is commonly referred to as Fundamental Attribution Error.

What is even more interesting is that when we think about our own mistakes, we tend to overestimate the external situational factors that led to our behavior and undervalue our internal attributes. In a nutshell: others' mistakes are a result of their personal weaknesses, but our mistakes are due to factors unrelated to our personal weaknesses. When we step back and really look at it, it becomes evident that this is not quite equitable.

We have to ask ourselves – do we really have the whole picture? Do we really understand that person’s life circumstances? Are we really aware of the resources available to them? For example, in the situation noted above, is the parent able to afford a sick day? Does she get paid sick days? Did unforeseen bills make it impossible to purchase all the makings of a fully balanced lunch?

Perhaps, before judging, we could step back, think, and apply the same criteria we use to evaluate ourselves. This is difficult because we rarely fully grasp the intimate and circumstantial details of another person's life. This is why we are most likely to make this error regarding people we don't know well. If we accept that we lack a complete understanding of the entire picture, it is best not to fill in the blanks with speculation about the person. I am certain you would appreciate the same from others when your conduct is on the line. I know I do.

The word awesome, in my opinion, is overused. There are rare moments, however, that truly inspire a response worthy of the word. It is easy to take such moments for granted and assume that they will happen again. I have found that appreciating such moments, as they are happening, enhances the wonder and makes them all the more meaningful. I experienced one of those moments on Saturday 1/9/10 at Allegany State Park. Driving into the park, ascending the winding tree-lined road to the Summit, is often quite beautiful. On this particular day, the jubilant anticipation of skiing was far surpassed by the sheer splendor of what unfolded before my eyes. It had snowed the night before, and all the trees above 2000 feet were completely encased in silky white snow. What made it all the more spectacular was the backdrop of the cloudless sapphire-blue sky. The depth of color was reminiscent of the blue I had previously witnessed only at high altitude in the Rocky Mountains. It was truly awesome! The entire cross-country ski trail system wound its way through this magical zone. I cherish this memory – it was time very well spent.

Did you know that you are likely to accept as true those pieces of information that make immediate sense to you? In a similar vein, did you know that you are more likely to take in information that supports your beliefs and to reject or ignore information that runs counter to them? Lastly, did you know that you are likely to use entirely different criteria to evaluate someone else's behavior than you use to evaluate your own?

These three tendencies are pervasive cognitive biases. They are so universal that they seem to be hard-wired into our brains. I want to spend some time exploring these biases because they commonly lead to mistakes, or at least to the maintenance and promulgation of misinformation. Over the next several weeks I will delve into these biases, one at a time, and hopefully help you avoid the erroneous trappings of your own neurology.

The first bias is known as Spinoza's Conjecture. The 17th-century Dutch philosopher Benedict Spinoza wrote that "mere comprehension of a statement entails the tacit acceptance of its being true, whereas disbelief requires a subsequent process of rejection." Sam Harris, a noted neuroscientist, has written that most people have difficulty tolerating vagueness. On the other hand, he has stated that "belief comes quickly and naturally." The end result is that "skepticism is slow and unnatural."

The second bias, known as Confirmation Bias, refers to a type of selective thinking whereby one tends to notice and to look for what confirms one's beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one's beliefs (Skeptic's Dictionary). In other words, we hear what we want to hear.

The third bias is Fundamental Attribution Error. This bias refers to our tendency to overestimate the influence of an individual's internal or personal attributes and to underestimate external or situational factors when explaining the behaviors of others. This is particularly true when we don't know the other person very well. So other people mess up because they are stupid or lazy; we make mistakes because we are tired, stressed, or have been shortchanged in some way.

As we will explore later, there are personal, organizational, and societal costs associated with each of these biases. This is particularly true if we are unaware of these tendencies. I’ll discuss this more next time.