How Do You Think?

Have you ever wondered what makes a pundit a pundit? I mean really! Is there pundit school or a degree in punditry? Given what I hear, I can only imagine that what would be conferred upon graduation is a B.S. of a different, more effluent sort. I mean REALLY!

I am certain that many of you have heard the rhetoric spewed by many of the talking heads on television and talk radio. This is true regardless of their alleged political ideology. And even more alarming, it seems to me, is that the more bombastic they are, the more popular they are. A pundit is supposed to be an expert – one with greater knowledge and insight than the general population – and consequently one who should possess the capacity to analyze current scenarios and draw better conclusions about the future than typical folk.

However, what we typically hear is two or more supremely confident versions of reality. Name the issue, be it anthropogenic global warming, health care reform, or the value of free market systems: virtually no two pundits can agree, unless of course they are political brethren.

Have you ever wondered if anyone has ever put the predictive reliability of these so-called experts to a test? Well, Philip Tetlock, a psychology professor at UC Berkeley, has done just that. In 1984 Tetlock undertook such an analysis, and his initial data was so alarming (everybody had called the future wrong with regard to the Cold War and the demise of the USSR) that he decided to embark on what would eventually become a two-decade-long quantitative analysis of, and report card on, the true predictive capabilities of professional pundits.

In 2005 Tetlock published his findings in his book, Expert Political Judgment: How Good Is It? How Can We Know? The results were again surprising. He analyzed the predictions made by over 280 professional experts. He gave each a series of professionally relevant real-life situations and asked them to make probability predictions pertaining to three possible outcomes (often in the form of: things will stay the same, get better, or get worse). Further, Tetlock interviewed each expert to evaluate the thought processes used to draw their conclusions.

In the end, after nearly twenty years of predictions and real life playing itself out, Tetlock was able to analyze the accuracy of over 82,000 predictions. And the results were conclusive – the pundits performed worse than random chance in predicting outcomes within their supposed areas of expertise. These experts were able to accurately predict the future less than 33% of the time, and non-specialists did just as well. To make matters worse, the most famous pundits were the least accurate. A clear pattern emerged – confidence in one’s predictions was highly correlated with error. Those who were most confident about their predictions were most often the least accurate, and yet, despite their inaccuracy, they were the most popular! Tetlock noted that they were essentially blinded by their certainty.
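A quick back-of-the-envelope check on that 33% figure: Tetlock's questions had three coarse outcomes, so a forecaster who guesses uniformly at random should be right about one time in three. The simulation below is purely illustrative (randomly generated data, not Tetlock's), just to make that baseline concrete:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

OUTCOMES = ["get worse", "stay the same", "get better"]

def random_pundit(n_questions):
    """A 'pundit' who guesses uniformly at random among the three outcomes."""
    return [random.choice(OUTCOMES) for _ in range(n_questions)]

n = 100_000
actual = [random.choice(OUTCOMES) for _ in range(n)]  # stand-in for real events
guesses = random_pundit(n)

accuracy = sum(g == a for g, a in zip(guesses, actual)) / n
print(f"random-chance accuracy: {accuracy:.3f}")  # hovers around 0.333
```

An expert scoring below this line, as Tetlock reports, is not merely uninformed – his model of the world is actively steering him away from the truth.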

Jonah Lehrer in How We Decide wrote of Tetlock’s study and stated: “When pundits were convinced that they were right, they ignored any brain areas that implied that they might be wrong. This suggests that one of the best ways to distinguish genuine from phony expertise is to look at how a person responds to dissonant data. Does he or she reject the data out of hand? Perform elaborate mental gymnastics to avoid admitting error?” He also suggested that people should “ignore those commentators that seem too confident or self-assured. The people on television who are most certain are almost certainly going to be wrong.”

You might be surprised that the vast majority of the pundits actually believed that they were engaging in objective and rational analysis when drawing their conclusions.

So, experts, rationally analyzing data, drawing conclusions with less than random chance accuracy? One has to question either their actual level of expertise or the objectivity of their analysis. Tetlock suggests that they are “prisoners of their preconceptions.”

This raises the question: Is this an error of reason or an error of intuition? Jonah Lehrer suggests that this error actually plays out as one cherry-picks which feelings to acknowledge and which to ignore. Lehrer noted: “Instead of trusting their gut feelings, they found ways to disregard the insights that contradicted their ideologies… Instead of encouraging the arguments inside their heads, these pundits settled on answers and then came up with reasons to justify those answers.”

Chabris and Simons in The Invisible Gorilla discuss why we are taken in by the pundits despite their measurable incompetence, and why they likely make the errors that they do. The bottom line is that such ubiquitous errors (made by novices and experts alike) are in fact illusions of knowledge perpetrated by intuition – and, further, that we are suckers for confidence.

First of all, our intuitive inclination is to overgeneralize and assume that one’s confidence is a measure of one’s competence. Such an assumption is appropriate in situations where we personally know the limits of the individual’s capabilities. When it comes to pundits, few people know the supposed expert well enough to accurately assess whether he or she is worthy of that confidence. Regardless, people prefer and are drawn toward confidence. Our intuitive attraction to, and trust in, confidence sets us up for error. It is the illusion of confidence.

Chabris and Simons then review numerous stories and studies that “show that even scientific experts can dramatically overestimate what they know.” They demonstrate how we confuse familiarity with knowledge – and that when our knowledge is put to the test “…our depth of understanding is sufficiently shallow that we may exhaust our knowledge after just the first question. We know that there is an answer, and we feel that we know it, but until asked to produce it we seem blissfully unaware of the shortcomings in our own knowledge.” They add:

“And even when we do check our knowledge, we often mislead ourselves. We focus on those snippets of information that we do possess, or can easily obtain, but ignore all of the elements that are missing, leaving us with the impression that we understand everything we need to.”

So what can we safely conclude?

For certain, we should be aware of the limits of our knowledge and remain ever vigilant and skeptical about what experts espouse (particularly if they come off as being very confident). Tetlock suggests that responsible pundits should state their predictions in measurable terms – so that they are subject to analysis – both for error correction/learning and for accountability purposes. Further, he discusses the importance of placing predictions within error bars denoting the probability of accuracy. Chabris and Simons contend that only through rational analytic thought can we overcome the illusion of knowledge. We have to stave off our intuitive inclination to trust bold, black-and-white predictions; we have to accept that complicated issues demand complicated solutions and that predicting the future is very difficult. As such, we need to get more comfortable with probabilities and become more skeptical of certainties. As for the pundits – they are not worth listening to – they are almost always wrong – and all they really do is polarize the process and the nation. We need to inform one another of this – and ultimately make an active rational choice to stop victimizing ourselves.
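Tetlock's call for measurable, probabilistic predictions has a standard mechanism behind it: a proper scoring rule such as the Brier score, which punishes a confident miss far more than a hedged one. The sketch below implements the textbook formula; the forecast numbers are invented for illustration, not drawn from Tetlock's data:

```python
def brier_score(forecasts, outcomes):
    """Mean squared distance between forecast probabilities and reality.

    forecasts: list of probability distributions over the three outcomes
               (worse / same / better), one distribution per question.
    outcomes:  list of indices of the outcome that actually occurred.
    Lower is better; 0.0 is a perfect forecaster.
    """
    total = 0.0
    for probs, actual in zip(forecasts, outcomes):
        total += sum((p - (1.0 if i == actual else 0.0)) ** 2
                     for i, p in enumerate(probs))
    return total / len(forecasts)

# Invented example: a hedged forecaster vs. an overconfident one,
# judged on two questions where "stay the same" (index 1) occurred.
hedged = [[0.40, 0.35, 0.25], [0.30, 0.40, 0.30]]
confident = [[1.00, 0.00, 0.00], [0.00, 1.00, 0.00]]
actual = [1, 1]

print(brier_score(hedged, actual))     # ~0.59: modest penalty on both questions
print(brier_score(confident, actual))  # 1.0: one perfect call, one costly miss
```

Scored this way, a pundit's track record becomes a number that can be compared and audited, which is precisely the accountability Tetlock argues for.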

Over the last couple of months I have submitted posts proclaiming the potency of intuition. One of my major resources has been Malcolm Gladwell’s Blink: The Power of Thinking Without Thinking. Among Gladwell’s tenets, the most prominent was the power of intuition and its relative supremacy, in certain situations, over rational thought. I have also heavily referenced Jonah Lehrer’s How We Decide. Lehrer argues that there is not, in fact, a Platonic dichotomy that establishes rationality in a supreme and distinct role over intuition. Instead, he suggests that emotion plays a key role in making decisions – much more so than has historically been acknowledged. Lehrer, however, uses more scientific scrutiny and relies more heavily on research than does Gladwell.

Currently I am reading The Invisible Gorilla by Daniel J. Simons and Christopher F. Chabris. These cognitive psychologists are best known for their invisible gorilla study illustrating selective attention. The authors appear to be on a mission to resurrect rational thought by highlighting the inherent weaknesses of intuition. Gladwell in particular comes under scrutiny for his alleged glorification of rapid cognition.

Not only have Gladwell’s hypotheses come under attack, so too has his journalistic approach. Simons and Chabris efficiently deconstruct a couple of Gladwell’s anecdotes as examples of illusions manifested by intuition. Contrary to the message of Blink, Simons and Chabris contend that intuition is inherently problematic, and they detail the automatic illusions that spring forth from the adaptive unconscious.

Anecdotal evidence is inherently flawed yet amazingly compelling. Gladwell, they acknowledge, is a master story teller, and he uses this talent to effectively support his contentions. They argue, however, that he falls prey to the very illusions of intuition that he is ultimately celebrating.

Jonah Lehrer seems to escape Simons’ and Chabris’ scrutiny – yet this may simply be an artifact of release date. How We Decide was released in 2009, while Gladwell’s Blink was released in 2005. Whereas Blink appears on the surface to be a celebration of intuition, Lehrer instead puts a microscope on the brain and the interplay of reason and emotion. He identifies the regions of the brain thought to be involved in these functions and highlights the research that systematically debunks the notion of reason and emotion as distinct epic foes battling it out for supremacy. Lehrer does not seem to celebrate the relative power of intuition over reason, but instead makes it clear that emotion, acting as a messenger of intuition, actually plays a crucial role in reason itself.

Rarely are the parts of complex systems clearly distinct. Dividing brain function into dichotomous terms like reason and intuition is just another example of a flawed human inclination to pigeonhole nature or make issues black and white. Although Gladwell puts a more positive spin on intuition than has historically been the case, he also makes an effort to identify at least some of its shortcomings. Lehrer brings into focus the complexity and interconnectedness of the system and dispels the traditional dichotomy. Simons and Chabris scientifically scrutinize the Gladwellian notion of the supremacy of intuition. Their skeptical message lacks the sex appeal of thinking without thinking, but it is very important just the same. I look forward to detailing parts of The Invisible Gorilla in the weeks to come.

Believe it or not, free will, to a large extent, is an illusion. For the most part, what you do as you go through your day is based on decisions made outside of your conscious awareness. Many of these decisions involve a complicated and largely unconscious interplay among various brain regions, each struggling for control of your behavior.

One has to be careful to avoid anthropomorphic tendencies when trying to understand this epic struggle. It is not as though there are specific Freudian (Id, Ego, Superego) forces at play, each with a specific and unique mission. In reality it is more like chemical warfare going on in your brain – where neurotransmitters are released by those relevant brain centers based on current environmental circumstances (what your senses perceive in the world), your previous experiences in similar circumstances, and your treasure trove of knowledge. The subsequent emotions triggered by those neurotransmitters are then weighed out in the orbitofrontal cortex (OFC) in what has essentially been a tug of war involving varying measures of reinforcement and punishment.

Most of us are unaware of this neurological process and are under the illusion that we go through life making rational, reason-based decisions. Although we may live within this illusion, the people who lay out supercenter floor plans or produce advertisements know the truth. This discrepancy in knowledge makes you vulnerable. They use their knowledge of how the brain works in a manipulative and concerted effort to help you part ways with your hard-earned money. It is not really a conspiracy; it is just an effort to gain a competitive advantage. It’s business.

What follows is an abbreviated explanation of the brain systems in play and then an exposé of how marketers use our brains against us. This information is drawn from Jonah Lehrer’s excellent book, How We Decide.

First there is the dopamine reward pathway. Dopamine is a neurotransmitter that serves a number of important functions in the brain. One of its most cogent roles is played out as a result of activation of the nucleus accumbens (NAcc). When the NAcc is activated it floods the brain with dopamine and we as a result experience pleasure. Desire for an item activates the NAcc. Being in the presence of the desired item activates it further. The greater the arousal of the NAcc the more pleasure we experience. It is your NAcc that is responsible for the happiness you feel when you eat a piece of chocolate cake, or listen to your favorite song, or watch your sports team win an exciting game (Lehrer, 2009).

Then there is the insula – a brain region that produces, among other sensations, aversive feelings. In a New York Times article on the insula, Sandra Blakeslee (2006) noted that this center “lights up” in brain scans when people feel pain, anticipate pain, empathize with others, see disgust on someone’s face, are shunned in social settings, and decide not to buy an item. In many cases we avoid exciting the insula, as it is the system that produces the unpleasantness of caffeine or nicotine withdrawal and the negative feelings associated with spending money.

Super stores are designed to excite your NAcc and quiet the insula. You can’t help but notice when you walk into a Target, Walmart, Lowes, or even Pier 1 Imports just how much stuff is there – most of which you do not possess. Just by entering the store you have aroused your NAcc and the associated cravings. Lehrer (2009) notes:

“Just look at the interior of a Costco warehouse. It’s no accident that the most coveted items are put in the most prominent places. A row of high-definition televisions lines the entrance. The fancy jewelry, Rolex watches, iPods, and other luxury items are conspicuously placed along the corridors with the heaviest foot traffic. And then there are the free samples of food, liberally distributed throughout the store. The goal of a Costco is to constantly prime the pleasure centers of the brain, to keep us lusting after things we don’t need. Even though you probably won’t buy the Rolex, just looking at the fancy watch makes you more likely to buy something else, since the desired item activates the NAcc. You have been conditioned to crave a reward.”

He further noted:

“But exciting the NAcc is not enough; retailers must also inhibit the insula. This brain area is responsible for making sure you don’t get ripped off, and when it’s repeatedly assured by retail stores that low prices are “guaranteed,” or that a certain item is on sale, or that it’s getting the “wholesale price,” the insula stops worrying so much about the price tag. In fact, researchers have found that when a store puts a promotional sticker next to a price tag – something like “Bargain Buy!” or “Hot Deal!” – but doesn’t actually reduce the price, sales of that item still dramatically increase. The retail tactics lull the brain into buying more things, since the insula is pacified. We go broke convinced that we are saving money.”

I hypothesize that the frequently redundant catalogs that routinely fill our mailboxes from retailers like LLBean and Lands End work on our brains much like super centers do. They excite the NAcc with idealized images modeled by perfect pretty people. They pacify the insula by noting improved features, sales, and deep discounts on closeouts. The necessary use of credit cards, Lehrer (2009) notes, has an additional inhibitory effect on the insula. When the insula is calm and you are primed with dopamine, the pleasure center has a disproportionate amount of control. You may think you have complete rational control over this – but all of it takes place outside of your direct awareness and plays out as feelings that guide your behavior. I further hypothesize that online retail stores work in a similar way (although for some shoppers the insula may be aroused by security concerns about using a credit card online). Regardless, substantial marketing attempts by companies like EMS, REI, Victoria’s Secret, LLBean, and Bath & Body Works fill my inbox, always hoping to draw in my NAcc, pacify my insula, and subsequently open my wallet. You have to guess that the amount of money devoted to catalogs and internet marketing pays off for these companies or they wouldn’t do it.

Being aware of one’s neurology and how we are manipulated may help us mediate these unconscious forces and thus help us make better decisions. I myself try to avoid Malls and stores like Target because of the feelings they create in me. And for this very reason, I’ve stopped routinely looking at catalogs. I try to shop based only on need – not want. I’m making progress – but it is hard – these patterns have been in place and reinforced for a long time.

For nearly as long as humans have been thinking about thinking, one of the most intriguing issues has been the interplay of reason and emotion. For the greatest thinkers throughout recorded history, reason has reigned supreme. The traditional paradigm has been one of a dichotomy in which refined and uniquely human REASON wages an ongoing battle for control over animalistic and lustful EMOTIONS. It has been argued by the likes of Plato, Descartes, Kant, and even Thomas Jefferson that reason is the means to enlightenment and that emotion is the sure road to human suffering (Lehrer, 2009).

This Platonic dichotomy remains a pillar of Western thought (Lehrer, 2009). Suppressing your urges is a matter of will – recall the mantras “Just say no!” or “Just do it!” My guess is that most people today continue to think of the brain in these terms. Until recently even the cognitive sciences reinforced this notion. Only through very recent advances in the tools used to study the brain (e.g., fMRI) and other ingenious studies (e.g., Damasio’s Iowa Gambling Task) has any evidence been generated to place this traditional paradigm in doubt. As it turns out, emotion plays a very crucial role in decision making. Without it, our ability to reason effectively is seriously compromised. I had long believed that feelings and emotions should be under the control of our evolutionary gift – the frontal cortex. Reason, after all, is what sets us apart from the other animals. Instead, we have learned that these forces are NOT foes but essentially collaborative and completely interdependent.

The implications of this recent knowledge certainly do not suggest that it is fruitless to employ our reason and critical thinking capabilities as we venture through life. Reason is crucial and it does set us apart from other life forms that lack such fully developed frontal cortices. This part of the outdated concept is correct. However, we are wrong to suppose that emotion with regard to decision making lacks value or that it is a villainous force.

Jonah Lehrer, in his book, How We Decide discusses this very issue and notes that: “The crucial importance of our emotions – the fact that we can’t make decisions without them – contradicts the conventional view of human nature, with its ancient philosophical roots.” He further notes:

“The expansion of the frontal cortex during human evolution did not turn us into purely rational creatures, able to ignore our impulses. In fact, neuroscience now knows that the opposite is true: a significant part of our frontal cortex is involved with emotion. David Hume, the eighteenth-century Scottish philosopher who delighted in heretical ideas, was right when he declared that reason was “the slave of the passions.”

So how does this work? How do emotion and critical thinking join forces? Neuroscientists now know that the orbitofrontal cortex (OFC) is the brain center where this interplay takes place. Located in the lower frontal cortex (the area just above and behind your eyes), your OFC integrates a multitude of information from various brain regions along with visceral emotions in an attempt to facilitate adaptive decision making. Current neuroimaging evidence suggests that the OFC is involved in monitoring, learning, as well as the memorization of the potency of both reinforcers and punishers. It operates within your adaptive unconscious – analyzing the available options, and communicating its decisions by creating emotions that are supposed to help you make decisions.

Next time you are faced with a decision, and you experience an associated emotion – it is the result of your OFC’s attempt to tell you what to do. Such feelings actually guide most of our decisions.

Most animals lack an OFC, and in our primate cousins this cortical area is much smaller. As a result, these other organisms lack the capacity to use emotions to guide their decisions. Lehrer notes: “From the perspective of the human brain, Homo sapiens is the most emotional animal of all.”

I am struck by the reality that natural selection has hit upon this opaque approach to guide behavior. This just reinforces the notion that evolution is not goal directed. Had evolution been goal directed or had we been intelligently designed don’t you suppose a more direct or more obviously rational process would have been devised? The reality of the OFC even draws into question the notion of free will – which is a topic all its own.

This largely adaptive brain system of course has drawbacks and limitations – many of which I have previously discussed (e.g., implicit associations, cognitive conservatism, attribution error, cognitive biases, essentialism, pareidolia). This is true, in part, because these newer and “higher” brain functions are relatively recent evolutionary developments and the kinks have yet to be worked out (Lehrer, 2009). I also believe that perhaps the complexities and diversions of modernity exceed our neural specifications. Perhaps in time natural selection will take us in a different direction, but none of us will ever see this. Regardless, by learning about how our brains work, we can certainly take an active role in shaping how we think. How do you think?

Recently Fox News aired a story posing the question of whether Fred Rogers was evil. Why, you may ask, would anyone use the word evil in reference to such a gentle man? They were suggesting that his “you’re special” message fostered unearned self-esteem and in effect ruined an entire generation of children. This accusation inspired a fair amount of discourse that in some cases boiled down to the question of why children today are such hollow, needy shells. An example of the discourse on this topic can be seen at Bruce Hood’s blog in an article entitled Mr. Rogers is Evil According to Fox News.

The consensus among skeptics was that Mr. Rogers was not, in fact, evil and that he is not responsible for the current juvenile generation’s need for copious praise and attention in exchange for relatively meaningless contributions. There was almost universal acknowledgment of the problem, however, and discussions led to troubling issues such as grade inflation at schools and universities and poor performance in the workplace. An intriguing article by Carol Mithers in the Ladies’ Home Journal, entitled Workplace Wars, addresses the workplace implications of this phenomenon. Mithers notes:

“… the Millennials — at a whopping 83 million, the biggest generation of all — are technokids, glued to their cell phones, laptops, and iPods. They’ve grown up in a world with few boundaries and think nothing of forming virtual friendships through the Internet or disclosing intimate details about themselves on social networking sites. And, many critics charge, they’ve been so coddled and overpraised by hovering parents that they enter the job market convinced of their own importance. Crane calls them the T-ball Generation for the childhood sport where “no one fails, everyone on the team’s assured a hit, and every kid gets a trophy, just for showing up.”

Workers of this generation are known for their optimism and energy — but also their demands: “They want feedback, flexibility, fun, the chance to do meaningful work right away and a ‘customized’ career that allows them to slow down or speed up to match the different phases of life,” says Ron Alsop, author of The Trophy Kids Grow Up: How the Millennial Generation Is Shaking Up the Workplace.

I find it ironic that the very people today who struggle with the behavior of the Millennials are the ones who shaped the behaviors of concern. I personally have struggled with the rampant misapplication of praise, attention, and the provision of reinforcement for meaningless achievements. I have seen this everywhere – in homes, schools, youth athletic clubs, you name it. It has been the most recent parenting zeitgeist. But where did this philosophy come from?

Throughout my doctoral training in psychology (late ’80s and early ’90s) I learned that reinforcement is a powerful tool, but it was clear to me that it has to be applied following behaviors you WANT to increase. Nowhere in my studies did I read of the importance of raising children through the application of copious amounts of reinforcement just to bolster their self-esteem. I am aware of no evidence-based teachings that suggest this approach. However, given the near universal application of these practices, it must have come from somewhere. This very question, I’m sure, led to the placement of responsibility squarely on the shoulders of poor Mr. Rogers.

Although the source of this approach remains a mystery to me, Dr. Carol Dweck’s work clarifies the process behind the outcome. In an interview in Highlights, Dr. Dweck discusses developing a Growth Mindset. Dr. Dweck has identified two basic mindsets that profoundly shape the thinking and behavior that we as adults exhibit and foster in our children. She refers to these as the Fixed Mindset and the Growth Mindset. People with a Fixed Mindset, Dr. Dweck notes in the Highlights article, “believe that their achievements are based on innate abilities. As a result, they are reluctant to take on challenges.” Dweck further notes that “People with Growth Mindsets believe that they can learn, change, and develop needed skills. They are better equipped to handle inevitable setbacks, and know that hard work can help them accomplish their goals.” In the same article she suggests that we should think twice about praising kids for being “smart” or “talented,” since this may foster a Fixed Mindset. Instead, if we encourage our kids’ efforts, acknowledging their persistence and hard work, we will support their development of a Growth Mindset – better equipping them to learn, persist, and pick themselves up when things don’t go their way.

Dweck’s conclusions are based on extensive research that clearly supports this notion. Jonah Lehrer, in his powerful book How We Decide, discussed the relevance of Dweck’s most famous study. This work involved more than 400 fifth-grade students in New York City, who were individually given a set of relatively simple nonverbal puzzles. Upon completing the puzzles, the students were given one of two one-sentence praise statements. Half of the participants were praised for their innate intelligence (e.g., “You must be smart at this.”). The other half were praised for their effort (e.g., “You must have worked really hard.”).

All participants were then given a choice between two subsequent tasks – one described as a more challenging set of puzzles (paired with the assurance that they would learn a lot from attempting it) and a set of easier puzzles like the ones they had just completed. In summarizing Dweck’s results, Lehrer noted: “Of the group of kids that had been praised for their efforts, 90 percent chose the harder set of puzzles. However, of the kids that were praised for their intelligence, most went for the easier test.” Dweck concludes that praise statements focused on intelligence encourage risk avoidance. The “smart” children do not want to risk having their innate intelligence come under suspicion. It is better to take the safe route and maintain the perception and feeling of being smart.

Dweck went on to demonstrate how this fear of failure can inhibit learning. The same participants were then given a third set of puzzles that were intentionally very difficult, in order to see how the children would respond to the challenge. Those who had been praised for their effort on the initial puzzles worked diligently on the very difficult puzzles, and many of them remarked about how much they enjoyed the challenge. The children who had been praised for their intelligence were easily discouraged and quickly gave up. Their innate intelligence had been challenged – perhaps they were not so smart after all. All subjects then completed a final round of testing, a set of puzzles comparable in difficulty to the first, relatively simple set. Those participants praised for their effort showed marked improvements in their performance: on average their scores improved by 30 percentage points. Those who had been praised for their intelligence, the very children whose confidence had just been shaken by the very difficult puzzles, on average scored 20 percentage points lower than they had on the first set. Lehrer noted, in reference to the participants praised for their effort, that “Because these kids were willing to challenge themselves, even if it meant failing at first, they ended up performing at a much higher level.” With regard to the participants praised for intelligence, Lehrer writes: “The experience of failure had been so discouraging for the “smart” kids that they actually regressed.”

In the Highlights interview Dweck suggests:

“It’s a mistake to think that when children are not challenged they feel unconditionally loved. When you give children easy tasks and praise them to the skies for their success, they come to think that your love and respect depend on their doing things quickly and easily. They become afraid to do hard things and make mistakes, lest they lose your love and respect. When children know you value challenges, effort, mistakes, and learning, they won’t worry about disappointing you if they don’t do something well right away.”

She further notes:

“The biggest surprise has been learning the extent of the problem—how fragile and frightened children and young adults are today (while often acting knowing and entitled). I watched as so many of our Winter Olympics athletes folded after a setback. Coaches have complained to me that many of their athletes can’t take constructive feedback without experiencing it as a blow to their self-esteem. I have read in the news, story after story, how young workers can hardly get through the day without constant praise and perhaps an award. I see in my own students the fear of participating in class and making a mistake or looking foolish. Parents and educators tried to give these kids self-esteem on a silver platter, but instead seem to have created a generation of very vulnerable people.”

So, we have an improved understanding of what has happened – but not necessarily of how the thinking that drives such parenting behavior came to be. Regardless, it is what it is, and all we can do is change our future behavior. Here are some cogent words of advice from Dr. Dweck (again from the Highlights article):

“Parents can also show children that they value learning and improvement, not just quick, perfect performance. When children do something quickly and perfectly or get an easy A in school, parents should not tell the children how great they are. Otherwise, the children will equate being smart with quick and easy success, and they will become afraid of challenges. Parents should, whenever possible, show pleasure over their children’s learning and improvement.”

“Parents should not shield their children from challenges, mistakes, and struggles. Instead, parents should teach children to love challenges. They can say things like “This is hard. What fun!” or “This is too easy. It’s no fun.” They should teach their children to embrace mistakes, “Oooh, here’s an interesting mistake. What should we do next?” And they should teach them to love effort: “That was a fantastic struggle. You really stuck to it and made great progress” or “This will take a lot of effort—boy, will it be fun.“

“Finally, parents must stop praising their children’s intelligence. My research has shown that, far from boosting children’s self-esteem, it makes them more fragile and can undermine their motivation and learning. Praising children’s intelligence puts them in a fixed mindset, makes them afraid of making mistakes, and makes them lose their confidence when something is hard for them. Instead, parents should praise the process—their children’s effort, strategy, perseverance, or improvement. Then the children will be willing to take on challenges and will know how to stick with things—even the hard ones.”

“I saw it with my own two eyes!” Does this argument suffice? As it turns out – “NO!” – that’s not quite good enough. Seeing should not necessarily lead to believing. Need proof? Play the video below.

As should be evident as a result of this video, what we perceive can’t necessarily be fully trusted. Our brains complete patterns, fill in missing data, interpret, and make sense of chaos in ways that do not necessarily coincide with reality. Need more proof? Check these out.

Visual Illusion – A & B are the same shade of gray

Illusion – Notice the perceived motion around the green circles.

Convinced? The software in our brains is responsible for these phenomena. And this software was coded through progressive evolutionary steps that conferred survival benefits on those with such capabilities. Just as pareidolia confers a survival advantage on those who assign agency to things that go bump in the night, there are survival advantages for those who carry the adaptations responsible for these errors.

So really, you can’t trust what you see. Check out the following video for further implications.

Many of you are likely surprised by what you missed. We tend to see what we are looking for, and we may miss other important pieces of information. The implications of this video seriously challenge the value of eyewitness testimony.

To add insult to injury, you have to know that even our memory is vulnerable. Memory is a reconstructive process, not a reproductive one.2 During memory retrieval we piece together fragments of information; however, due to our own biases and expectations, errors creep in.2 Most often these errors are minimal, so despite these small deviations from reality, our memories are usually pretty reliable. Sometimes, however, too many errors are inserted and our memory becomes unreliable.2 In extreme cases, our memories can be completely false2 (even though we are convinced of their accuracy). This confabulation, as it is called, is most often unintentional and can occur spontaneously as a result of the power of suggestion (e.g., leading questions or exposure to a manipulated photograph).2 Frontal lobe damage (due to a tumor or traumatic brain injury) is known to make one more vulnerable to such errors.2

Even when our brain is functioning properly, we are susceptible to such departures from reality. We are more vulnerable to illusions and hallucinations, be they hypnagogic or otherwise, when we are ill (e.g., have a high fever, are sleep deprived, oxygen deprived, or have neurotransmitter imbalances). All of us are likely to experience at least one if not many illusions or hallucinations throughout our lifetime. In most cases the occurrence is perfectly normal, simply an acute neurological misfiring. Regardless, many individuals experience religious conversions or become convinced of personal alien abductions as a result of these aberrant neurological phenomena.

We are most susceptible to these particular inaccuracies when we are ignorant of them. On the other hand, improved decisions are likely if we understand these mechanisms, as well as the limitations of the brain’s capacity to process incoming sensory information. Bottom line – you can’t necessarily believe what you see. The same is true for your other senses as well – and these sensory experiences are tightly associated and integrated into long-term memory storage. When you consider the vulnerabilities of our memory, it leaves one wondering to what degree we reside within reality.

For the most part, our perceptions of the world are real. If you think about it, were it otherwise we would be at a survival disadvantage. The errors in perception we experience are in part a result of the rapid cognitions we make in our adaptive unconscious (intuitive brain) so that we can quickly process and successfully react to our environment. For the most part it works very well. But sometimes we experience aberrations, and it is important that we understand the workings of these cognitive missteps. This awareness absolutely necessitates skepticism. Be careful what you believe!

I just spent two weeks in Europe with my fellow adventurer and wife, visiting the relics of times gone by. In the Louvre we peered at works laid down well over two thousand years ago by Greek sculptors, as well as works by Roman, Middle Age, Byzantine, Gothic, Renaissance, and Baroque artists. We admired the Impressionists at the Musée d’Orsay.

We then traveled to Venice, a city that blended Byzantine, International Gothic, Renaissance, and Baroque art and architecture in a way that is unique to this breathtaking city. Its Eastern influences are palpable. Then on to Florence, the home of the Renaissance, which proved to be a showcase for the works of da Vinci, Botticelli, Titian, Michelangelo, and many others.

When in Rome, we focused on the age of the Empire, devoting our attention to the Colosseum, Palatine Hill, the Roman Forum, the Pantheon, and our day trip to Pompeii. We didn’t prioritize the treasures at the Vatican or the many other indoor sites. Between the Louvre, the Orsay, the Uffizi, and the many works within the countless basilicas and churches we had previously visited, we had had our fill of crowded indoor shrines. Here we largely stayed out of doors. The Pantheon was far more striking than I had imagined. And Pompeii, wow! It has to be seen to be appreciated.

All this is relevant because, although you can see it all at home, it is just not the same. Go to Google Maps and search for Pompeii. You can tour the site using street view. Or get a book, or watch Travel Channel or History Channel episodes on these great destinations. I guarantee it won’t be the same as seeing it in person, touching it, feeling it, or breathing it in, in vivo. No duh, right?

Well, what is it about seeing the “real thing?” Why was I moved to tears to see a statue of Galileo in Florence? Why was it exciting to walk the same basalt cobbles in the Roman Forum as historical figures such as Julius Caesar, Brutus, Marc Antony, and Augustus? Why were there throngs of people gathered around da Vinci’s Mona Lisa? All over Paris, Venice, and Florence you could find decent replicas (prints and even posters) – yet these images gathered no lines.

The answer is essentialism. There is nothing on the streets left by these famous people that magically imbues the stones with a quality that makes them somehow special. They don’t contain anything truly special at all. I absorbed nothing by touching them or by looking at da Vinci’s or Michelangelo’s original works. And my personal telescope is far more capable than any Galileo original. But it was very exciting to see two of the scopes that he himself had made.

I knew that there was an irrational magical quality to these experiences. I knew I was cognitively embellishing all the aforementioned relics; however, I was able to let go, and enjoy the emotional implications. I did, however, find myself less inclined to part with my few and precious Euros for sentimental mementos (made in China) to remember this trip by.

There is a learning curve to the application of skepticism. Raw, unchecked challenges to others’ beliefs, in a social context, are not well tolerated. People tend to find such challenges rather off-putting. In fact, as I have certainly encountered, they elicit defensiveness and sometimes hurt feelings. People often own their ideas and beliefs in a way that is essentially linked to their identity. As Carl Sagan wrote in ‘The Demon-Haunted World’: “All of us cherish our beliefs. They are, to a degree, self-defining. When someone comes along who challenges our belief system as insufficiently well-based — or who, like Socrates, merely asks embarrassing questions that we haven’t thought of, or demonstrates that we’ve swept key underlying assumptions under the rug — it becomes much more than a search for knowledge. It feels like a personal assault.”

These assaults repel people and in effect insulate them from the rational inquiry you may wish to posit. People are inclined to respond to uninvited or poorly crafted skepticism much as one would respond to contemptuous arrogance.

Throughout most of human history, the social consequences of skeptical inquiry were likely quite costly. This was most certainly true in the pre-agrarian stages of our evolution. It is believed that throughout early human evolution individual survival was linked to social cohesion. Although this is not as true today, in prehistory skepticism likely hindered, rather than promoted, survival. With this in mind, it certainly makes sense that we as a species are inclined toward unquestioning belief rather than skepticism. This inclination also makes us vulnerable to mysticism and superstition. Natural selection, it seems, has selected for gullibility.

Sensitive, judicious, and sparing use of skepticism, in social contexts, is prudent. This is true unless you just don’t care about how others feel about you, how they feel about interacting with you, and even about how they feel about themselves. There is a time and place for everything. Choosing those times carefully, and selecting one’s words even more cautiously, will more likely get better results.

I admire great thinkers like Bruno, Copernicus, and Galileo, who faced more than mere social consequences for putting forward their theories. Bruno, in fact, paid with his life. Darwin too faced significant costs. However, their rejection of accepted explanations (stemming from skeptical inquiry) moved us forward. We owe much to these men for their courage and steadfast dedication to the truth. We move forward when we step away from blind acceptance; but let’s not turn a blind eye toward the social consequences of our own personal skepticism.

I have devoted numerous posts to a general category of cognitive errors and biases that are broadly lumped into errors associated with the intuitive mind. The lay notions of intuition are often referred to as gut instincts, and they are generally considered emotional and irrational responses. It is in this context that intuition is vilified. Such impulsive reactions are countered with teachings typified by adages such as: “Look before you leap;” “Don’t judge a book by its cover;” “Haste makes waste;” and “The hurrier you go the behinder you get.” Although this narrow understanding of intuition is in part correct, it largely misses the mark regarding this very complicated and sophisticated neuro-system. Intuition is largely misunderstood – and, frankly, it has never been well understood to begin with. Herein I hope to offer a cursory explanation of intuition and broadly differentiate it from rational thought. The vast majority of the following content is drawn from Malcolm Gladwell’s intriguing 2005 book, ‘Blink: The Power of Thinking Without Thinking.’ Gladwell draws together a vast array of research from cognitive and social psychology and a number of other sciences in an attempt to elucidate this ambiguous concept.

Rational thought serves as a good starting place because it offers a good point of comparison, helping to bring intuition into slightly better focus. Reason is the hallmark of rational thought. It involves an active application of the cerebral cortex, whereby personal history, knowledge, and active cognitions are employed in a conscious manner to solve problems. The key words here are active and conscious. When we engage in reasoning we are generally aware of the cognitive effort directed toward this process. Another aspect of relevance to this process is the passage of time. Reason-based thought is not generally instantaneous. Although solutions may seem to pop into awareness out of the blue, generally some measure of time passes as we strive for enlightenment. Think of an occasion when you had word-finding difficulties. You probably actively thought about the word, the context of the word, and so on. If you failed to recall the word you may have cognitively moved on to something else, only to have the word come to you. The former was rational thought; the latter, the result of intuitive thought.

Intuition is different from rational thought with regard to those key variables. First, this instantaneous process is seemingly unconscious. Second, it is automatic (or at least seemingly so) consuming no apparent effort or time. The popular and scientific literature is replete with descriptive names for this seemingly mystical capacity. Gladwell uses a full complement of these terms and he sprinkles them throughout his text. Terms that emanate from the sciences include the adaptive unconscious, unconscious reasoning, rapid cognition, and thin slicing. Other descriptive terms include snap judgments, fast and frugal thinking, and eloquently the “mind behind the locked door.” Regardless of what we call it, intuition is constantly at work, drawing instantaneous conclusions outside of our awareness.

Because of the nature of this process, Gladwell notes that people are often ignorant of the secret decisions that affect their behavior, yet they do not feel ignorant. We often behave in ways driven by the adaptive unconscious and later try to justify those behaviors, invoking the rational brain to do so. This fact is what calls into question the reality of free will. Intriguing, isn’t it? It is as though there is a covert super-powerful, super-fast computer running in tandem with our overt reasoning computer: yet outside our awareness this covert computer remains ever vigilant, soaking in the world through our senses and actively directing our behavior.

Although the adaptive unconscious lies outside our direct control, life experiences, practice, and our intellectual pursuits contribute to the data set that is used when snap judgments are made. The more informed, erudite, and experienced one is, the more accurate one’s rapid cognitions become. Just think about driving. When learning to drive there are an overwhelming number of things to think about – so many, in fact, that the mistakes made are likely due to “analysis paralysis.” Too much to compute! Through practice and repetition, all those things we previously had to actively think about become more automatic. We don’t think about the countless micro adjustments we make on the steering wheel as we drive down the highway. Novice drivers must think about these adjustments, along with attending to their speed (generally with gross applications of the accelerator and brakes) and myriad other factors that seasoned drivers do not overtly contemplate. The novice’s driving is chunky – experienced drivers, with the benefit of many miles in the driver’s seat, are generally smoother and more refined in their driving.

Experts in their given fields become more intuitive or automatic with regard to their area of expertise over time as a result of exposure, learning, and practice. Their thoughts become seemingly automatic, their judgments and reactions more spontaneous – all of this in many situations without the expert even having to actively think. In these cases (where there is sufficient expertise) snap judgments can be even more accurate than the arduous process of working through problems rationally. On the other hand, this intuitive process can lead to problems because it is remarkably susceptible to prejudices and errors. This is particularly true, as you might surmise, in areas where the individual lacks experience or knowledge.

Under certain circumstances the adaptive unconscious serves our purposes very well. In addition to those situations where one’s expertise applies, we tend to effectively use snap judgments in social situations, in complicated situations, or in life or death situations that necessitate quick decisions. This is where evolution has played a role in shaping this capacity. It has had the effect of contributing to the survival of our species. He who can make effective snap judgments in life or death situations is more likely to pass on this very capacity. And tens of thousands of years of such natural selection has refined this capacity.

The catch is that there are erroneous thought processes that are artifacts, residuals, or the direct consequence of the adaptive unconscious. Issues such as essentialism, pareidolia, and superstition fall into this category, as they have been ushered along with the survival advantage that the adaptive unconscious has conferred. Cognitive errors and biases hamper the effectiveness of the adaptive unconscious because of its inclination toward implicit associations and other error-imposing tendencies. Implicit associations are automatic and non-deliberate pairings we make between concepts, people, things, etc. (e.g., African Americans are athletic, blonds are scatterbrained, gay men are effeminate) as they are folded into memory. This is an intriguing concept, one deserving its own post, but you have to take the Implicit Associations Test, particularly the race test, to get a true sense of this powerful bias. Confirmation bias, self-serving bias, and the numerous other cognitive biases are likewise linked to this influential super-computer. However, just because we cannot directly and purposefully access this incredible system does not mean we have to bow entirely to its influence. In fact, we can proactively prime this system through active learning. And we can be aware of this powerful system and the advantages and disadvantages it confers. We can learn of the errors it inclines us toward and monitor ourselves when it comes to our biases and prejudices. We can impose certain rules of thought when it comes to important issues. I believe that we all should take these very important steps, both to make our intuitive brain more accurate and to buffer its influences in those situations where it is likely to lead us astray.

References:

Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. New York: Little, Brown and Company.

Essentialism, within the purview of psychology, is a cognitive bias whose roots form in early childhood (Gelman, 2004). This concept pertains to the notion that all discernible objects harbor an underlying reality that, although intangible, gives each and every object its true identity – its essence (Dawkins, 2009; Hood, 2008). To put it another way:

In our early childhood, as we were developing language, essentialism played a crucial role in the expansion of our vocabulary, the generalization of our knowledge, in discriminating among objects, and in our ability to construct causal explanations (Gelman, 2004). In our struggle to understand the vast and complicated world, our brain forced us to partition things into categories, so we chopped and divided what we surveyed into distinct groupings based on defining characteristics, driven by our internalized understanding of the essence of those groupings. This was initially a very simplistic process (dog, cat, cow), then more complex (mammal, reptile, insect), and then even more sophisticated for those who progressed in the biological sciences (kingdom, phylum, class, order, family, genus, species). This is necessarily a dynamic process, because as we mature and take in increasing complexity we need increased specificity when parsing the world into discrete categories.

This pattern of thinking/learning transcends all cultures and is central to our language development (Hood, 2008). Given this central role, it forms the foundation of our thought processes (Hood, 2008; Dawkins, 2009). The overgeneralization of this process is what gets us into difficulty. Bruce Hood, author of Supersense (2008), convincingly argues that this innate tendency forms the core of our superstitious and supernatural thinking. Richard Dawkins (2009), an evolutionary biologist, suggests that such an inclination explains why people have such great difficulty grasping and accepting the concept of evolution by means of natural selection. I suggest that, like evolution (which necessitates quintessentially anti-essentialist thinking), the concepts of plate tectonics, deep geological time, and deep space time are also very hard to grasp for the same reasons. We are inclined to think that what we see are constants – that the world as we see it has been eternally so, and so shall it always remain.

In biology, essentialism sustains the notion that all animals are clear and distinct, belonging to a specific species. In fact, as Dawkins suggests: “On the ‘population-thinking’ evolution view, every animal [living form] is linked to every other animal [living form], say rabbit to leopard, by a chain of intermediates, each so similar to the next that every link could in principle mate with its neighbors in the chain and produce fertile offspring” (2009, p. 24). This is true for all conceivable pairings including bacteria and viruses, giant sequoias and lichen, spiders and flies, cats and dogs, birds and snakes, foxes and chickens, and even humans and turnips.

Plato demonstrated essentialist thinking in The Republic in his cave allegory, where he suggested that the world as we experience it is only a composite of mere shadows tethered to their true and perfect forms (essences) floating about somewhere in the heavens (Dawkins, 2009; Hood, 2008). Many people still believe that there is something more to the physical world than what we see. As Hood (2008) put it, “Humans like to think that special things are unique by virtue of something deep and irreplaceable.” This thinking, and other intuitive errors such as vitalism (that vital life energies cause things to be alive) and holism (that everything is connected by forces) are likely artifacts of our natural involuntary inclinations (Hood, 2008).

Essentialism is more than a heuristic, and it has ramifications beyond making us less inclined to believe in evolution or more inclined toward superstition. It is what makes rape more than a physical crime. The defilement and contamination the victim feels is a psychological violation of one’s essential integrity. Genocide is perpetrated by individuals who dehumanize or define the victims as essentially different and/or contaminated. Essentialism is what makes original works of art more valuable than exact duplicates (Hood, 2008). It also drives the belief systems that sustain homeopathy.

It is interesting that this intuitive process plays such an important and fundamental role in our development and sustains both powerfully positive and hugely negative influences on us as adults. When you get right down to the essence of this concept, you must accept that these inclinations have their roots in the same thinking that makes a preschool child believe that a Mommy can’t be a firefighter (Gelman, 2004).

References:

Dawkins, R. (2009). The Greatest Show on Earth: The Evidence for Evolution. New York: Free Press.