Tuesday, April 27, 2010

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, and boy howdy, are there a lot of them! In this week’s installment, brought to you by the letter “N,” we discuss the need for closure, the neglect of probability, the “Not Invented Here” (NIH) Syndrome, and notational bias.

Need for Closure

On a scale of 1 (strongly disagree) to 6 (strongly agree), how would you rate yourself on the following statements?

I don't like to go into a situation without knowing what I can expect from it.

I think that having clear rules and order at work is essential for success.

I'd rather know bad news than stay in a state of uncertainty.

I usually make important decisions quickly and confidently.

I do not usually consult many different opinions before forming my own view.

These questions are part of the 42-item Need for Closure Scale (NFCS), a way to measure the extent of your need for cognitive closure, your desire for an answer to settle the matter, even if the answer isn’t the correct one or the best one.

The NFCS measures five facets of the need for closure. The first statement above tests your desire for predictability. In order, the others test your preference for order and structure, your degree of discomfort with ambiguity, your decisiveness, and your closed-mindedness.

If you have a high need for closure, you tend to rely more on information received earlier, and prefer the first workable answer you come across. You tend to search for information more narrowly, and apply rules and shortcuts to aid quick decision-making. A low need for closure is, unsurprisingly, associated with creativity, especially the process of coming up with a large number of potential solutions.

Need for closure is affected by outside circumstances as well as by basic temperament. Time pressure, in particular, plays a significant role. The need for closure is attributed not only to individuals, but also to cultures as a whole, illustrated by the argument that the “need for national closure” warranted stopping the process of recounting votes in the Florida 2000 presidential election.

Neglect of probability

Several of our cognitive biases involve misapplication or misunderstanding of probability in a given situation. So far, we’ve covered the base rate effect, the gambler’s fallacy, the hindsight bias, and the ludic fallacy.

Neglect of probability is something different. It’s the complete disregard of probability rather than its incorrect use. Children are particularly subject to this bias. In a 1993 study, children were asked the following question:

Susan and Jennifer are arguing about whether they should wear seat belts when they ride in a car. Susan says that you should. Jennifer says you shouldn’t. . . . Jennifer says that she heard of an accident where a car fell into a lake and a woman was kept from getting out in time because of wearing her seat belt, and another accident where a seat belt kept someone from getting out of the car in time when there was a fire. What do you think about this?

Here’s how one subject responded:

A: Well, in that case I don’t think you should wear a seat belt.

Q (interviewer): How do you know when that’s gonna happen?

A: Like, just hope it doesn’t!

Q: So, should you or shouldn’t you wear seat belts?

A: Well, tell-you-the-truth we should wear seat belts.

Q: How come?

A: Just in case of an accident. You won’t get hurt as much as you will if you didn’t wear a seat belt.

Q: OK, well what about these kinds of things, when people get trapped?

A: I don’t think you should, in that case.

Another subject replied, “If you have a long trip, you wear seat belts half way.” Notice that the comparative probability of the two events doesn’t come into the discussion at all.

For adults, a 2001 study found that a typical subject was willing to pay $7 to avoid a 1% chance of a painful electric shock, but only $10 to avoid a 99% chance of the same shock, suggesting that probability is more likely to be neglected when the outcomes produce anxiety.
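To see why this counts as neglect rather than mere miscalculation, compare the subjects’ answers with what a simple expected-value calculation would predict. The sketch below is my own illustration, not part of the study, and the $10 “certainty” value is an assumed figure:

```python
# Illustrative sketch: willingness to pay (WTP) to avoid a loss, if people
# simply weighted the loss by its probability (expected value).
# The $10 certainty figure is an assumption for illustration.

def expected_value_wtp(probability: float, wtp_at_certainty: float) -> float:
    """WTP implied by expected value: probability times the value of
    avoiding the outcome for certain."""
    return probability * wtp_at_certainty

certain_value = 10.0  # assume avoiding the shock outright is worth $10

wtp_1_percent = expected_value_wtp(0.01, certain_value)   # about $0.10
wtp_99_percent = expected_value_wtp(0.99, certain_value)  # about $9.90

# Expected value predicts a 99-fold difference between the two offers;
# the subjects' $7 vs. $10 is a ratio of only about 1.4.
print(wtp_99_percent / wtp_1_percent)
```

On this account, a subject paying $7 to dodge a 1% chance is pricing the anxiety, not the probability.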

“Not Invented Here” (NIH) Syndrome

It doesn’t take a lot of experience in the world of work before you begin to encounter the “Not Invented Here” syndrome. Although mostly intended as a somewhat cynical joke, the behavior is quite real and has a significant effect on organizations. Interestingly, it’s not always negative, and not always antithetical to creativity and innovation. Like many cognitive biases, the trick is to be conscious of how it works in your life and in your organization.

Numerous factors can trigger an NIH response. Personal and organizational egotism plays a large role: we are inherently superior or unique, therefore what may work elsewhere is inferior or inapplicable. Loyalty matters, too. In the early days of personal computing, Sinclair’s machines (sold in the United States as the Timex Sinclair) were hugely popular in Britain but hardly known in the United States, and the Japanese/Dutch MSX standard was successful in Japan and much of Europe, but not in either Britain or the United States.

There can be economic advantages to NIH behavior. Television networks more commonly buy programs from suppliers in which they have a financial interest; such shows are more profitable to the network than a higher-rated show from a non-affiliated supplier. Economic advantages can also accrue to individuals even as they penalize the organization. In one case, a department refused to help another group in the same company because doing so would, perversely, have lowered the bonuses of those doing the helping.

NIH can also form the basis of corporate strategy, and as such can be a vehicle to promote innovation rather than retard it. Apple, for example, commonly ignores or actively denigrates trends in the computer industry and invents its own. “Netbooks aren’t better than anything,” argued Steve Jobs. “They’re just cheap laptops.” Accordingly, Apple ignored the netbook model and invented its own: the iPad.

Notational bias

BRITANNUS (shocked).

Caesar: this is not proper.

THEODOTUS (outraged).

How!

CAESAR (recovering his self-possession).

Pardon him, Theodotus: he is a barbarian, and thinks that the customs of his tribe and island are the laws of nature.

This famous moment from George Bernard Shaw’s Caesar and Cleopatra illustrates notational bias, the assumption that conventions of one’s own society are equivalent to laws of logic or of nature. Examples abound. If you read most European languages, you read from left to right. It’s “natural.” But if you read Hebrew, it’s the other way around. It’s “natural” for Americans to drive on the right. But of course these are ultimately arbitrary choices that become the norm for a particular culture.

When you fall into notational bias, it’s not about whether you prefer your culture’s choice, or even whether your culture’s choice is arguably better. Instead, notational bias blinds you to the idea that there’s even a choice to be made.

Monday, April 19, 2010

Do people always get what they deserve? Would you sooner get a $5 discount or avoid a $5 surcharge? If you get 99 heads in a row, are the odds of another head 50/50? Do you like the familiar? Is everything more expensive these days?

In this installment of Cognitive Biases, we'll cover the just-world phenomenon, loss aversion, the ludic fallacy, the mere exposure effect, and the money illusion.

The illustration is by Gustave Doré, from the Book of Job.

Just-world phenomenon

“He must be wicked to deserve such pain,” wrote Robert Browning in “Childe Roland to the Dark Tower Came,” and indeed the idea that people get what they deserve, both for good and for evil, goes back through history. When Job was suffering, his friends Bildad, Zophar, and Eliphaz each argued that Job must have done something wrong, because God would not visit such terrible punishments on an innocent.

The cognitive bias known as the just-world phenomenon refers to the tendency of people witnessing an otherwise inexplicable injustice to look for reasons the victim might have deserved it. In theology, the attempt to explain why evil falls on the apparently innocent is known by the all-too-apt name of theodicy.

It’s been demonstrated scientifically as well. In one study, a woman appeared to receive painful electric shocks while working on a difficult memory problem. Women of broadly the same age and social group who observed the experiment tended to blame the victim for her fate, praised the experiment, and rated her as less physically attractive than did those who had seen her but not the experiment.

In another study, female and male subjects were told two versions of a story about an interaction between a woman and a man. Both variations were exactly the same except at the very end: in one version the man raped the woman, and in the other he proposed marriage. In both conditions, subjects of both sexes viewed the woman’s (identical) actions as inevitably leading to the (very different) results.

The rain, it is said, falls on the just and unjust alike. Don’t make negative assumptions about people you don’t even know.

Loss aversion

Would you sooner get a $5 discount, or avoid a $5 surcharge? It’s the same $5 either way, but depending on the frame, there’s a dramatic difference in consumer behavior. Some studies suggest that the value of avoiding a loss is psychologically twice as powerful as the value of a gain. In one study of consumer reaction to price changes to an insurance policy, a price increase had twice the effect on customer switching as did a price decrease.

Loss aversion also plays into “sunk cost” bias. If you’ve been gambling and you’re in the hole, it’s the tendency to keep playing in hopes of recovering the lost money. The refusal to admit mistakes is part of loss aversion. The more time and energy you’ve committed to a particular course of action, the harder it is to walk away from it, regardless of the evidence.

Ludic fallacy

If you’ve flipped a coin 99 times and gotten heads each time, what are the odds of getting heads on the next flip of the coin? We’ve already learned about the gambler’s fallacy, so we know the odds of the next flip coming up heads are still 50/50.

But wait a minute. If you’ve flipped a coin 99 times and gotten heads each time, wouldn’t you start to suspect there was something wrong with the coin? The ludic fallacy (a term coined by Nassim Nicholas Taleb in his 2007 book The Black Swan) is the assumption that messy situations in the real world fall neatly into the models of games and dice.
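The coin example can be made precise with a quick Bayesian sketch (my addition, not Taleb’s): even if you start out almost certain the coin is fair, 99 straight heads should all but destroy that belief. The one-in-a-million prior on a trick coin is an assumed figure:

```python
# Bayesian sketch: after 99 heads, how plausible is "the coin is fair"?
# Assumed prior: a million-to-one against the coin being double-headed.

fair_prior = 0.999999
biased_prior = 1.0 - fair_prior   # coin that always lands heads

likelihood_fair = 0.5 ** 99       # chance of 99 heads from a fair coin
likelihood_biased = 1.0           # certainty of 99 heads from a trick coin

# Bayes' rule: posterior is prior times likelihood, renormalized.
posterior_fair = (likelihood_fair * fair_prior) / (
    likelihood_fair * fair_prior + likelihood_biased * biased_prior
)

# Even that lopsided prior is overwhelmed: the fair-coin hypothesis ends
# up with a posterior probability of roughly 1.6e-24.
print(posterior_fair)
```

Inside the tidy model, 50/50 is the right answer; in the messy world, the model itself is what 99 heads calls into question.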

There’s a lot of value in simplifying a complex problem to identify core principles, but there’s a strong risk of believing the simple model is identical to the messy real world, and that’s wrong. Theory and models are subordinate to reality, not superior to it.

Mere exposure effect

People tend to develop a preference for things merely because they are familiar with them. In studies of interpersonal attraction, the more often a person is seen by someone, the more pleasing and likeable that person appears to be.

When subjects were exposed to an unfamiliar stimulus in laboratory experiments, they reacted to it more positively than other, similar stimuli which had not been presented. In one variation, subjects were shown an image on a tachistoscope for a very brief duration that could not be perceived consciously. This subliminal exposure produced the same effect, though it is important to note that subliminal effects are generally weak and unlikely to occur without controlled laboratory conditions.

The effect is strongest when unfamiliar stimuli are presented briefly. Mere exposure typically reaches its maximum effect within 10-20 presentations, and some studies even show that liking may decline after a longer series of exposures. For example, people generally like a song more after they have heard it a few times, but many repetitions can reduce this preference. A delay between exposure and the measurement of liking actually tends to increase the strength of the effect. Curiously, the effect is weaker in children, and for drawings and paintings as compared to other types of stimuli. One social psychology experiment showed that exposure to people we initially dislike makes us dislike them even more.

Money illusion

“I asked for a three-penny loaf,” wrote Benjamin Franklin about his first day in Philadelphia in 1723, “and was told they had none such. So not considering or knowing the difference of money, and the greater cheapness nor the names of his bread, I made him give me three-penny worth of any sort. He gave me, accordingly, three great puffy rolls. I was surpriz’d at the quantity, but took it, and, having no room in my pockets, walk’d off with a roll under each arm, and eating the other.”

When this story was first presented to me in school, the teacher and students discussed how cheap bread must have been in those days. A loaf for a penny! But, of course, that’s not true. The average weekly wage was about a dollar, meaning a penny represented about half an hour’s worth of work. Today, the median personal income for a 25 year old with a bachelor’s degree is about $50,000, and (I just checked) you can buy a loaf of white bread for $1.00 at the local store. That means bread costs about 3 minutes worth of work, or a tenth as much as Benjamin Franklin paid.

Has gas gotten more expensive? In 1958, gas cost 24¢ a gallon, but that’s $2.24 in current terms. How about postage? In real terms, a first class stamp today costs less than it did in the 1940s, when it hit an inflation-adjusted spike of 51¢ (4¢ at the time).

The face value of money isn’t as important as its purchasing power, but psychologically, people don’t believe it. If you get a 2% pay cut, it feels unfair and hugely damages morale. But if inflation is 4% and you get a 2% raise, you’re in essentially the same position, yet you’re more likely to think you’re being treated well.
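For the record, the pay-cut comparison is close but not quite exact once compounding is counted. A small sketch (my numbers, following the text) shows the two cases land within a tenth of a percentage point of each other:

```python
# Real (inflation-adjusted) change in purchasing power for the two cases
# in the text: a 2% cut with no inflation vs. a 2% raise with 4% inflation.

def real_change(nominal_change: float, inflation: float) -> float:
    """Compounded real change: (1 + nominal) / (1 + inflation) - 1."""
    return (1.0 + nominal_change) / (1.0 + inflation) - 1.0

cut_no_inflation = real_change(-0.02, 0.00)     # exactly -2%
raise_with_inflation = real_change(0.02, 0.04)  # about -1.9%

print(cut_no_inflation, raise_with_inflation)
```

Either way, the raise-with-inflation employee loses about as much purchasing power as the one who took the cut, and feels better about it.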

Tuesday, April 13, 2010

Today, Tuesday, April 13, 2010, is the 35th anniversary of a killing spree in Wheaton, Maryland. My girlfriend and I were on our way home from Young Frankenstein when we drove right through the middle of it.

Michael Edward Pearch shot seven people, all African-American, killing two and wounding the rest. There were indications, police said, that the shootings were racially motivated. All the victims were black and the gunman was white. He passed up at least one car with whites, police said, as he walked down a highway looking for another target.

There were at least two such cars. One of them was mine.

Here’s the story.

Pearch, an unemployed carpenter living with his mother in Silver Spring, Maryland, left home about 7:30 p.m. on Sunday, April 13, 1975, and drove to the nearby Wheaton Plaza shopping mall. He was wearing his Army fatigues and a knapsack holding 250 rounds of ammunition, with a machete strapped to his chest. He carried a .45 caliber automatic pistol.

He walked to the traffic light at the entrance to the mall, where he shot and killed John L. Sligh, 43, of Rockville, Maryland, and wounded his wife, Laureen D. Sligh, 40, in both legs. He walked to the next car and fired at Dr. Ralph C. Gomes, also of Rockville, but missed. Gomes swerved, crashing his car into another. He suffered minor injuries.

The panic started at once. “Some witnesses ducked for cover. Others just stood there and watched in disbelieving shock,” said police captain Miles Daniels. One particularly brave man (I don’t know his name) called the police and began following the gunman.

We were on our way home from the movies. It was a warm spring evening. The car windows were open. As I neared the intersection of Georgia Avenue and University Boulevard (a major intersection), I heard what I thought at first were gunshots.

But gunshots on a lazy Sunday evening on a busy suburban street? Surely, I must be imagining things. Then I saw the man who had followed the gunman. He was ducking behind cars. Well, if there wasn’t any gunfire, then surely the man was just playing some sort of game.

The light turned green. I pulled forward. As I reached the intersection, I saw two men in the left turn lane on the other side of the street. One man was standing. He was white. One man was face down. He was black. In his right hand, he was carrying a brown paper bag.

If there wasn’t any gunfire, and the man ducking behind cars was playing some sort of game, then I figured I was looking at some drunks, with one of them (clutching his booze in a brown paper bag) passed out in the street.

As I drove through the intersection, I passed within five feet of Michael Edward Pearch, the shooter, and his most recent victim, Harold S. Navy, Jr., 17 years old and a freshman at the University of Maryland. Navy was working as a busboy at the Anchor Inn, right on the corner of Georgia and University. He had been sent across the street to a supermarket to buy a jar of applesauce, the contents of that brown paper bag. He was wounded in the abdomen, but survived.

There was a police station about a mile north of the intersection, right on our way home, so I pulled in. “There’s a drunk passed out in the left turn lane at Georgia and University,” I told the officer at the desk.

“Wait here,” the officer said.

Moments later three plainclothes officers came out of the back room. “Are those the eyewitnesses to the murders?” one of the officers asked.

It was not until that moment that I had any idea what I had seen.

We spent the rest of the evening in a room with an increasing number of witnesses. It wasn't until afterward that I learned the rest of the story.

Walking up Georgia Avenue, the gunman shot and killed Connie L. Stanley, 42, of Washington, DC, and then shot and wounded Rosalyn Stanley, 26, of Annapolis, who was in the next car.

Two policemen spotted the shooting and ordered Pearch to halt. He turned, looked at the officers, then walked to the next car with African-Americans and fired again, wounding Bryant Lamont Williams, 20, of Rockville. The two officers opened fire with a shotgun and a pistol, and killed Pearch.

“He was smiling. I thought he had been shooting blanks,” said William Painter, one of the 40-50 witnesses.

* * *

Some of my interest in cognitive biases and perceptual distortions stems from this incident. Eyewitness testimony, experts know, is not particularly reliable, especially from people not trained in the art. I was within five feet of the murderer, but couldn’t have picked him out of a lineup on a bet. I had no idea what was going on. I am still ashamed.

Selective perception underlies a lot of cognitive biases. We adjust and filter the world around us according to our sense of what the world should be, and therefore miss a lot about what the world really is.

I’ve been trying to get the details of this story for a long time, but even in a Google world, it’s hard to find. Perhaps it’s the small number of victims, but this particular incident doesn’t show up on any list of racial violence I can find. There’s a United Press article that appeared in various papers, ranging from the Fort Scott (Kansas) Tribune to the St. Petersburg (Florida) Evening Independent.

Michael Edward Pearch is mentioned as a potential suspect in the Wheaton abduction of the Lyon sisters, but there’s no evidence other than his killing spree to link him to the murders, and the Lyon sisters were white. He’s also mentioned on at least one white supremacist site, where he’s treated as a hero.

On the anniversary of this terrible event, I remember the victims, and remember also the lessons of my own failure to perceive what was going on all around me.

Tuesday, April 6, 2010

Most of us would take failing seven times out of ten as nature’s way of suggesting a different career field. But for a major league baseball player, that’s a .300 batting average – millions of dollars in salary and endorsements, and a shot at the Hall of Fame.

Most of us know that Babe Ruth also held the major league all-time strikeout record. Some of us know Charles Lindbergh’s other record – he’s tied for the most emergency parachute bailouts of all time. His safety record was so poor that the first company he approached to sell him a plane to cross the Atlantic refused him as a customer.

Success writers often tell these stories to give you hope. Yes, you may be a loser, but so were all these other people, and look how they turned out. But that misses the point:

If you are batting 1.000, one thing’s for sure – you’re not playing in the major leagues.

Successful people frequently have a shocking track record of failures and reverses on their way to the show. It’s what we call “paying your dues.” The things that matter have risks associated with them; success isn’t free. Losing is an essential part of winning.

The baseball theory of life is very simple. If you don’t swing, you can’t hit. If you do swing, the odds are against you. For some people, that’s a deeply disturbing idea. But you can also look at the baseball theory with great hope: failure isn’t the long, circuitous road to the top; it’s the only way to the top.

In earlier entries, we’ve talked about the concept of risk (R = P × I: the value of a risk equals the probability of the event times the impact of the event if it happens).

In pure risk (threat only), you can lower your risk by reducing the probability or by reducing the impact. In business risk (threat and opportunity combined, as in an investment decision), you have four ways to improve the situation: reduce the probability or impact of the downside, or increase the probability or impact of the upside.
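Under the R = P × I definition, those four levers are easy to see in a small sketch. The dollar figures below are illustrative assumptions, not from the text:

```python
# R = P x I: the value of a risk is its probability times its impact.
# Illustrative numbers for a business risk (threat plus opportunity).

def risk(probability: float, impact: float) -> float:
    """Expected value of a risk event."""
    return probability * impact

# Pure risk (threat only): a 10% chance of a $50,000 loss.
threat = risk(0.10, -50_000)

# Business risk: the same threat plus a 30% chance of a $40,000 gain.
opportunity = risk(0.30, 40_000)
net_position = threat + opportunity   # about $7,000

# One of the four levers: cut the probability of the downside in half.
improved = risk(0.05, -50_000) + opportunity  # about $9,500
print(net_position, improved)
```

The other three levers work the same way: shrink the downside impact, or raise the probability or impact of the upside.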

When it comes to applying the baseball theory, where the odds of success are always low, the big trick is to reduce the consequences of failure. The cheaper it is to fail, the smarter it is to take a chance.

When salespeople cold-call clients, they expect rejection far more often than success, but one success pays the bills for a hundred rejections. People who’ve known me for a long time know I’ve always got some sort of scheme going. Most fail — I doubt I bat better than .200 — but I know how to keep the cost down so that the occasional winner is good enough.

In other words, it’s not whether you win or lose — it’s how you structure your bets.


About Me

Michael Dobson is the author of over 60 books on leadership, project management, fiction, and history. A former researcher at the Smithsonian Institution and head of game design for TSR, Inc., Dobson's wide-ranging interests include science, science fiction, history, and much more.
