The Invisible Gorilla experiment

In this experiment, people were asked to watch a video of two teams playing basketball, one wearing white shirts and the other wearing black. Viewers were asked to count the number of passes made by members of the white team while ignoring the players wearing black.

This task is difficult and absorbing, forcing participants to concentrate fully. Halfway through the video, a gorilla appears, crosses the court, thumps its chest, and then continues across and off the screen.

The gorilla is in view for nine seconds, yet half of the viewers, when asked later, report noticing nothing unusual (that is, they did not see the gorilla). It is the counting task, and especially the instruction to ignore the black team, that causes the blindness.

While entertaining, the experiment offers several important insights.

One important insight is that nobody would miss the gorilla if they were not doing the counting task. When you are focused on a mentally challenging task, whether counting passes, doing math, or shooting aliens, you do not notice other events and cannot attend to them.

A second insight is that we do not realize the limitations we face when focused on one task. People are sure they did not miss the gorilla. As Kahneman writes, “we are blind to our blindness.”

System 1 and System 2

The Invisible Gorilla also serves as a framework for understanding the two systems people use to think. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. An example of System 1 thinking would be taking a shower (for an adult), where you do not even think about what you are doing.

System 2 thinking is deliberate, effortful, and orderly: slow thinking. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away.

The automatic operation of System 1 generates surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps.

Implications

Understanding System 1 and System 2 has several implications. First, if you are involved in an activity requiring System 2 thought, do not attempt a second activity that also requires System 2 thought. While walking and chewing bubble gum are both System 1 for most people and can be done simultaneously, negotiating a big deal and typing an email are both System 2 and should not be done at the same time.

Second, do not create products that require multiple System 2 actions concurrently. While System 2 engagement is great for getting a player immersed in a game, asking them to perform two System 2 tasks at once will create a poor experience. A third implication is that when onboarding someone to your product, you should expose them to only one System 2 activity at a time.

Example from our world: Urbano’s Failed App

I like to use examples from the game space to illustrate how understanding Kahneman and Tversky’s work can impact your business. In this example, Urbano runs product design for a fast-growing app company at the intersection of digital and television. He has built a great sports product that lets players play a very fun game while watching any sporting event on television. Unfortunately, Urbano’s company is running out of funds, and the next release needs to be a hit or they will not survive. Although the product has tested well, Urbano is nervous because of the financial situation and decides to add more to the product, making the game depend on what happened during the previous three minutes of the televised match. They launch the app, and although players initially start playing, they never come back, and the product fails.

Another company buys the rights to the product and conducts a focus test. They find that users forgot what happened on television because they were focusing on the app, and so could not complete the game. They remove the part requiring attention to the televised match, and the product is a huge success. The difference is that the revised product did not require two System 2 activities simultaneously; it left television watching as a System 1 activity.

Key Takeaways

In a famous experiment, people watching a basketball game who had to count the passes one team made missed the appearance of a gorilla in the video. The experiment showed that when you are focusing on something, you do not notice what else is happening.

We are blind to things in the background, and we are blind to our blindness. In the Invisible Gorilla experiment, not only did people not see the gorilla, they refused to believe they had missed it.

There are two types of mental activities: System 1, which is automatic and reflexive, and System 2, which requires deliberate, effortful, and orderly thinking.

As promised last month, I will spend a few blog posts summarizing Thinking, Fast and Slow by Daniel Kahneman. Before diving into the heuristics, biases, and processes that he and his colleague Amos Tversky identified, it is important to understand why he wrote the book and why it is so useful. Fortunately, he largely does this in his introduction, so that is a great place to start.

Let’s not beat ourselves up but make ourselves better

First, Kahneman points out that the goal of his research is not to prove we are idiots but to help us minimize bad decisions. Understanding flaws in human decision-making no more denigrates our intelligence than an article about disease in a medical journal belittles good health. Our decision-making is generally quite good, and most of our judgments are appropriate most of the time, but there are systematic biases, and understanding them can make our decision-making more effective.

By understanding Kahneman’s work, you will be better able to identify and understand errors of judgment and choice, first in others and then in yourself. As Kahneman points out, “an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause.”

Is our judgment flawed?

At its root, Kahneman began his career trying to determine whether people consistently make biased judgments. Long story short, we do.

One example drove this point home to Kahneman and Tversky, and it probably will for you as well. Early in their collaboration, both realized they made the same “silly” prognostications about the careers toddlers would pursue as adults. They both knew an argumentative three-year-old and felt it was likely he would become a lawyer, that the empathetic and mildly intrusive toddler would become a psychotherapist, and that the nerdy kid would become a professor. Both smart academics, they neglected the base-rate data (very few people become psychotherapists, professors, or even lawyers compared with other professions) and instead trusted the stories in their heads about who ends up in which career. This realization, that we are all biased, drove their ensuing research.
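The base-rate mistake can be made concrete with Bayes’ rule. The sketch below uses entirely made-up numbers (the base rate of lawyers and how “argumentative” children in each group are is assumed for illustration, not taken from the book): even a strongly diagnostic trait cannot overcome a tiny base rate.

```python
# Bayes' rule: P(lawyer | argumentative) =
#   P(argumentative | lawyer) * P(lawyer) / P(argumentative)
# All numbers are hypothetical, chosen only to illustrate base-rate neglect.

p_lawyer = 0.004               # assumed base rate: 0.4% of adults become lawyers
p_arg_given_lawyer = 0.90      # assume most lawyers were argumentative children
p_arg_given_not_lawyer = 0.30  # assume 30% of everyone else was, too

# Total probability of observing an argumentative child
p_argumentative = (p_arg_given_lawyer * p_lawyer
                   + p_arg_given_not_lawyer * (1 - p_lawyer))

# Posterior probability the argumentative child becomes a lawyer
p_lawyer_given_arg = p_arg_given_lawyer * p_lawyer / p_argumentative

print(f"P(lawyer | argumentative child) = {p_lawyer_given_arg:.3f}")  # roughly 0.012
```

Even though the trait makes "lawyer" about three times more likely than before, the posterior stays near one percent, which is exactly the information the vivid story in our heads throws away.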

How understanding Kahneman’s work impacts the world

The broad impact of Kahneman and Tversky’s work drives home its importance to everyone. When they published their first major paper, it was commonly accepted in academia that:

People are generally rational, and their thinking is normally sound

Emotions such as fear, affection and hatred explain most of the occasions on which people depart from rationality

Not only did these two assumptions drive academia (particularly economics and the social sciences), but their acceptance also often drove business and government decisions. The work laid out in Thinking, Fast and Slow, however, disproved both assumptions and thus led to entirely different decisions that generated stronger results.

Scholars in a host of disciplines have found the work useful and have leveraged it in fields such as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics, and military strategy. Kahneman cites an example from public policy: his research showed that people generally assess the relative importance of issues by the ease with which those issues are retrieved from memory, which is largely determined by the extent of media coverage. This insight now drives everything from election strategy to understanding (and countering) how authoritarian regimes manipulate the populace.

Kahneman and Tversky were also careful to ensure the subjects of their experiments were not simply university students. By using scholars and experts as subjects, thought leaders gained an unusual opportunity to observe possible flaws in their own thinking. Having seen themselves fail, they became more likely to question the dogmatic assumption, prevalent at the time, that the human mind is rational and logical. I found the same myself and am confident that you will also. The idea that our minds are susceptible to systematic errors is now generally accepted.

Why it is called Thinking, Fast and Slow

While Kahneman and Tversky’s early work focused on our biases in judgment, their later work focused on decision-making under uncertainty. They found systematic biases in our decisions that consistently violated the rules of rational choice.

Again, we should not discount our decision-making skills. There are many examples of experts who make critical decisions quickly, from a chess master who can identify the top 20 next moves on a board as he walks by to a fireman who knows which areas of a burning building to avoid.

What Kahneman and Tversky identified, though, is that while this expertise is often credited as good decision-making, it is more a matter of retrieving information from memory. The situation serves as a cue or trigger for the expert to retrieve the appropriate answer.

This insight helps us avoid a problem where our experience (which we consider intuition) does not help but hinders. In easy situations, intuition works. In difficult ones, we often answer the wrong question: we substitute an easier one, often without noticing the substitution.

If we fail to come to an intuitive solution, we switch to a more deliberate and effortful form of thinking. This is the slow thinking of the title. Fast thinking covers both expert intuition and heuristics.

Example from our world: The Allan Mistake

Many of my readers are experienced “experts” from the mobile game space, so I will start with a hypothetical example that many of us can relate to. In this example, Allan is the GM of his company’s puzzle game division. He has been in the game industry over twenty years and has seen many successful and failed projects. The CEO, Mark, comes to Allan and says they are about to sign one of three celebrities to build a game around.

Allan knows the demographics of puzzle players intimately and identifies the one celebrity who is most popular with Allan’s target customers. Nine months later they launch the game and it is an abysmal failure. Allan is terminated and wonders what he did wrong.

Allan then looks over his notes from when he read Thinking, Fast and Slow and realizes his fundamental mistake. When Mark asked which celebrity to use, Allan took the easy route and analyzed the three celebrity options. He did not tackle the actual question: whether it was beneficial to use a celebrity for a puzzle game at all, and only if the answer was yes, which of the three to pick. If he had answered the more difficult question (difficult also because it would have set him against Mark), he would have found that celebrity puzzle games are never successful, regardless of the celebrity. Although it may have created tension with Mark at the time, he probably would have been given an opportunity to create a game with a higher likelihood of success and would still be in his position.

Key takeaways

Our decision-making is not bad, but by understanding our systematic biases we can make it more effective.

Understanding that people are not consistently rational, and that this irrationality is not driven by emotion, allows us to make better decisions in fields as diverse as medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics, and military strategy.

Fast thinking refers to quick decisions and judgments based on our experience, while slow thinking is the deliberate analysis of difficult questions.

I recently finished Michael Lewis’ most recent book, The Undoing Project: A Friendship that Changed the World and it motivated me to revisit Daniel Kahneman’s Thinking, Fast and Slow. Lewis’ book describes the relationship between Daniel Kahneman and Amos Tversky, two psychologists whose research gave birth to behavioral economics, modern consumer behavior theory and the practical understanding of people’s decision making. He explains the challenges they faced and the breakthroughs that now seem obvious.

As I mentioned, The Undoing Project reminded me how important Kahneman’s book is; it is probably the most important book I have ever read. It has helped me professionally, both in understanding consumer behavior and in making better business decisions. It has helped me in my personal life as well, improving decisions on everything from holiday choices to career moves. It even helps explain the election of Donald Trump and how the situation in North Korea has developed.

In The Undoing Project, two things drove home the importance of Kahneman’s work. First, despite being a psychologist, Kahneman won the Nobel Prize in Economics in 2002. It is difficult enough to win a Nobel Prize (I’m still waiting for the call), but to do it in a field that is not your own is amazing. The second item that proved the value of Kahneman’s (and his colleague Amos Tversky’s) work was the Linda Problem. I will discuss this scenario later in this post, but the Linda Problem showed how people, myself included, do not make rational decisions. It convinced the mainstream that people, including doctors and intellectuals, consistently make irrational decisions.

Despite the value I derived from Thinking, Fast and Slow, I never felt I learned all I could from it. I found it very difficult to read, the exact opposite of a Michael Lewis book, and did not digest all the information Kahneman provided. Even when I recommended the book to friends, I often caveated the recommendation with a warning that it would be hard to get through.

Given the importance of Kahneman’s work and the challenge I (and probably others) have had in fully digesting Thinking, Fast and Slow, I will be writing a series of blog posts, each one summarizing one chapter of Kahneman’s book. I hope you find it as useful as I know I will.

The Linda Problem

As discussed above, the Linda Problem is the research by Kahneman and Tversky that largely proved people think irrationally, or at least do not apply logic consistently. While I normally like to paraphrase my learnings or put them into examples relevant to my audience, in this case it is best to quote the relevant description from The Undoing Project, as the Linda Problem was a scientific study that I do not want to misrepresent:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Linda was designed to be the stereotype of a feminist. Danny and Amos asked: To what degree does Linda resemble the typical member of each of the following classes?

Linda is a teacher in elementary school.

Linda works in a bookstore and takes Yoga classes.

Linda is active in the feminist movement.

Linda is a psychiatric social worker.

Linda is a member of the League of Women Voters.

Linda is a bank teller.

Linda is an insurance salesperson.

Linda is a bank teller and is active in the feminist movement.

Danny [Kahneman] passed out the Linda vignette to students at the University of British Columbia. In this first experiment, two different groups of students were given four of the eight descriptions and asked to judge the odds that they were true. One of the groups had “Linda is a bank teller” on its list; the other got “Linda is a bank teller and is active in the feminist movement.” Those were the only two descriptions that mattered, though of course the students didn’t know that. The group given “Linda is a bank teller and is active in the feminist movement” judged it more likely than the group assigned “Linda is a bank teller.” That result was all that Danny and Amos [Tversky] needed to make their big point: The rules of thumb people used to evaluate probability led to misjudgments. “Linda is a bank teller and is active in the feminist movement” could never be more probable than “Linda is a bank teller.” “Linda is a bank teller and active in the feminist movement” was just a special case of “Linda is a bank teller.” “Linda is a bank teller” included “Linda is a bank teller and activist in the feminist movement” along with “Linda is a bank teller and likes to walk naked through Serbian forests” and all other bank-telling Lindas.

One description was entirely contained by the other. People were blind to logic. They put the Linda problem in different ways, to make sure that the students who served as their lab rats weren’t misreading its first line as saying “Linda is a bank teller NOT active in the feminist movement.” They put it to graduate students with training in logic and statistics. They put it to doctors, in a complicated medical story, in which lay embedded the opportunity to make a fatal error of logic. In overwhelming numbers doctors made the same mistake as undergraduates.

The fact that almost everyone made the same logical mistake shows how powerful this understanding is. It demonstrates that our judgment, and thus our decision-making, is often not logical and contains systematic flaws. This understanding helps explain many things in life and business that sometimes do not seem to make sense.
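The logical error at the heart of the Linda Problem is the conjunction rule: for any two events, the probability of both occurring can never exceed the probability of either one alone. A minimal sketch (the specific probabilities are hypothetical, chosen only for illustration):

```python
# Conjunction rule: P(A and B) <= P(A) for any events A and B.
# The numbers are hypothetical, used only to illustrate the rule.

p_bank_teller = 0.05            # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.20  # assumed P(feminist | she is a bank teller)

# The conjunction is a subset of the single event, so its probability
# is the single event's probability scaled DOWN by a factor <= 1.
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

print(f"P(teller) = {p_bank_teller:.2f}")
print(f"P(teller and feminist) = {p_teller_and_feminist:.2f}")

# This holds no matter what numbers you choose: every bank-telling
# feminist Linda is also a bank-telling Linda.
assert p_teller_and_feminist <= p_bank_teller
```

Whatever values you plug in, the conjunction comes out no larger than the single event, which is why judging “bank teller and feminist” as more likely than “bank teller” is impossible under the rules of probability.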

The implications

Once you understand how our judgment is biased, it can help you make better decisions. It can also provide insights into how your customers view different options and why people behave as they do. In future posts, I will explore all of Kahneman and Tversky’s major findings and how they apply.

Key Takeaways

In The Undoing Project, Michael Lewis writes about the relationship and research of Daniel Kahneman and Amos Tversky, two psychologists who changed the way we understand decision making.

The Linda Problem proved to the non-believers that people make illogical judgments. When given a story about a fictional person and then potential descriptions of that person, virtually everyone (from students to very successful professionals) judged a description that was a subset of a broader description to be more likely, which is logically impossible.

By understanding how people make judgments and decisions, we can improve our own decision making process and better understand our friends, family and customers.



Lloyd Melnick

This is Lloyd Melnick’s personal blog. I am EVP Casino at VGW, where I lead the Chumba Casino team. I am a serial builder of businesses (senior leadership on three exits worth over $700 million), successful in big (Disney, Stars Group, Zynga) and small companies (Merscom, Spooky Cool Labs) with over 20 years experience in the gaming and casino space.