What are cognitive biases?
• Most people think they behave rationally
  (This is the premise on which most of current economic theory is based)
• It turns out that people are susceptible to limitations in thinking, judgement and decision making
  (and for the most part we are completely unaware of it)
• Stems from several areas of cognition
  • Memory
  • Perception
  • Feelings
  • Misapplication of statistical reasoning
  (There are lots more…)
• Introduced in 1972 by Amos Tversky and Daniel Kahneman
• The list of identified cognitive biases runs into the hundreds

Where do the biases come from?
• At some point in our evolutionary history these biases were useful adaptations
• Mostly they help us make decisions with limited information
  (Heuristics, rules of thumb)
• Useful if your only task for the day is survival, and speed of decision is more important than accuracy
  (If you think you are going to get eaten, it is better to act quickly than to enter into a thorough analysis of the situation)
• However, in today's world the opposite is true
  (The world we live in today is incredibly complex, and in most cases it is now better to be accurate than fast in our decision making)
• Now that we need to do more than just survive, these biases can get us into trouble
  (In the most serious cases they can get us killed)
• Parts of our lives and society rely on sound decisions
  • Finding a partner
  • Making a purchase
  • Juries!!!

Some examples – Confirmation Bias
(I have just picked a few interesting examples; they are not necessarily the most common, nor those with the largest impact on our lives)
• We like things that match our view of the world
  (We like people who are like us and share our interests. This likely comes from our evolutionary need to form social groups)
• To the extent that we seek out things that agree with us, whilst ignoring conflicting information
  (It is possible for two people to interpret the same information differently depending on their world view)
  (Fitting the terrain to the map rather than the other way around)
• It is a mental short cut for steering clear of things that may cause us harm
  (After all, if things are similar to things we already know and like, then they are probably alright)
• But…
  • Makes it difficult to let go of entrenched positions
  • Makes people open to scams such as psychic readings
  (What other examples?)
• The uncomfortable feeling of anxiety we get when the real world does not fit our world view is called cognitive dissonance
  (This is an actual physiological response to a psychological state)

Fundamental attribution error
• A quirk in the way we reason about causality
• Attributing an aspect of a person's behaviour to their fundamental character rather than the situation
• If I lose my keys I am just unlucky; if someone else loses their keys they are careless
  (They would see things the opposite way around)
• What other examples are there?
  (We attribute our own success to hard work and skill, and other people's success to luck)
  (When I don't write tests I have a good reason for doing so; if someone else doesn't write tests they are a bad developer)

Framing
• People react differently to a choice depending on how the information is framed
  • Loss versus gain
  • Positive versus negative
• Is it better to say a medical intervention has a 90% chance of success or a 10% chance of death?
• It depends on what you desire as an outcome
  (Do I want people to take the intervention or not?)
• This bias is the cornerstone of marketing and political spin
  (It has a very powerful effect on our decision making)

Anchoring
• Humans tend to rely on the first piece of information offered when making decisions. This is the anchor.
• Subsequent decisions are made by adjusting away from the anchor
• So in negotiations the first price offered sets the anchor
  (Are anchoring effects present in estimation?)
• Related to a similar effect called priming
  (Priming is subtly, or not so subtly, providing information that steers people towards a decision, stance or world view)
  (So in sprint planning, when I say a story "feels small" I am priming, consciously or not, towards a biased estimate)

Illusory Superiority
• People tend to overestimate their positive qualities relative to other people or groups
• This is widespread in all aspects of life
  • Intelligence, sporting ability, academic performance, job performance, popularity, confidence, …
• We even struggle to understand that in most cases it is a nonsensical premise
  (unless the distribution is very skewed, e.g. the average number of legs for a human; in cases like this the median is a more suitable measure)
  • Even at Cambridge, around half of the students' academic abilities are below average for their cohort
• This is linked to the Dunning–Kruger effect
  (People who lack knowledge tend to overestimate their abilities, often to the detriment of their more able peers)
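The "average number of legs" aside can be made concrete in a few lines of Python. The sketch below uses an invented population of 10,000 people to show how, for a very skewed distribution, almost everyone can be "above average" while the median stays sensible:

```python
# Hypothetical, deliberately skewed population: 9,999 people with two
# legs and one person with one. The numbers are illustrative only.
legs = [2] * 9_999 + [1]

mean_legs = sum(legs) / len(legs)                      # 1.9999
above_average = sum(1 for n in legs if n > mean_legs)

print(f"mean = {mean_legs}")                           # mean = 1.9999
print(f"above average: {above_average} of {len(legs)}")

# The median is a more suitable summary for a distribution like this:
median_legs = sorted(legs)[len(legs) // 2]
print(f"median = {median_legs}")                       # median = 2
```

Here 9,999 of 10,000 people have an above-average number of legs, which is why "half of us must be below average" only holds for roughly symmetric distributions.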

Survivorship Bias
• It is easy to see the things that survived – they are all around us
• We often overlook the things that didn't, because they are not visible
  • Could be actual people – medical trials not taking into account the people that dropped out
  • Focusing on what leading businesses / business leaders did, rather than understanding what the countless others who failed did or did not do
• The bomber problem was a real issue faced during WWII
  (It involved a statistician named Abraham Wald)
  http://youarenotsosmart.com/2013/05/23/survivorship-bias/
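Wald's bomber insight can be sketched as a toy simulation. Everything here is invented for illustration (the four areas, the hit counts, and the rule that engine hits are fatal); the point is only that inspecting survivors alone makes the most vulnerable area look the least damaged:

```python
import random

random.seed(0)
AREAS = ["engine", "fuselage", "wings", "tail"]

def fly_mission():
    """One sortie: 1-5 hits spread uniformly; engine hits are fatal in this model."""
    hits = [random.choice(AREAS) for _ in range(random.randint(1, 5))]
    return hits, "engine" not in hits

survivor_hits = []
for _ in range(10_000):
    hits, survived = fly_mission()
    if survived:
        survivor_hits.extend(hits)          # we only ever see the survivors

counts = {area: survivor_hits.count(area) for area in AREAS}
print(counts)
# Survivors show no engine damage at all -- not because engines are
# never hit, but because planes hit there never came back. Hence
# Wald's advice: armour the areas where the survivors show NO damage.
```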

Semmelweis Reflex
• This is the dismissal of new evidence because it does not comply with the established norms of the day
• Named after Ignaz Semmelweis, who found a causal link between childbed fever and the mortality rates of new mothers
• Demonstrated that washing hands could reduce death rates
• Ignored by his peers, as he could find no acceptable scientific explanation and his contemporaries simply refused to believe him
  (He was driven mad by his desire to have his theory taken seriously, and died in an asylum)
• Does this happen a lot in IT?
  (Dismissal of new approaches, technologies and techniques?)

Regression to the mean
• Not exactly a cognitive bias, but a similar failure in statistical reasoning
• If a variable is extreme on its first measurement, it will tend to be closer to the average on the second measurement
• Gives rise to the idea that punishment is effective
  • If someone performs badly and is punished, it is more likely that next time their performance will be closer to the average
  • Likewise with praise
  (It is not that punishment works and praise fails; it is just statistics)
  (What about things like improvements from agile, or the effectiveness of planning meetings?)
  (It can be hard to separate actual improvements from regression to the mean)
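The punishment/praise point can be demonstrated with no psychology at all. In the sketch below (a made-up model where a test score is fixed ability plus random luck), the worst performers on a first test score better on the second, even though nothing was done to them in between:

```python
import random

random.seed(42)
N = 10_000

ability = [random.gauss(0, 1) for _ in range(N)]   # a fixed trait
test1 = [a + random.gauss(0, 1) for a in ability]  # ability + luck
test2 = [a + random.gauss(0, 1) for a in ability]  # fresh, independent luck

# Select the bottom 5% on the first test (the group that gets "punished").
cutoff = sorted(test1)[N // 20]
worst = [i for i in range(N) if test1[i] <= cutoff]

avg1 = sum(test1[i] for i in worst) / len(worst)
avg2 = sum(test2[i] for i in worst) / len(worst)

print(f"worst group, first test:  {avg1:.2f}")
print(f"worst group, second test: {avg2:.2f}")
# The second-test average moves back towards the population mean of 0
# purely because the extreme first scores were partly bad luck.
```

The same selection applied to the top 5% would show their second scores drifting down, which is exactly the pattern that makes punishment look effective and praise look counterproductive.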

So now we know about them we are ok?

Transcript of "Cognitive biases"

1.
COGNITIVE BIASES

2.
What are cognitive biases?
• Most people think they behave rationally
• It turns out that people are susceptible to limitations in
thinking, judgement and decision making
• Stems from several areas of cognition
• Memory
• Perception
• Feelings
• Misapplication of statistical reasoning
• …
• Introduced in 1972 by Amos Tversky and Daniel
Kahneman
• The list of identified cognitive biases runs into the hundreds

3.
Where do the biases come from?
• At some point in our evolutionary history these biases
were useful adaptations
• Mostly they help make decisions with limited information
• Useful if your only task for the day is survival, and speed of
decision is more important than accuracy
• However in today’s world the opposite is true
• Now that we need to do more than just survive, these biases can
get us into trouble
• Parts of our lives and society rely on sound decisions
• Finding a partner
• Making a purchase
• Juries!!!
• …

4.
Some examples – Confirmation Bias
• We like things that match our view of the world
• To the extent that we seek out things that agree with us,
whilst ignoring conflicting information
• It is a mental short cut for steering clear of things that may
cause us harm
• But…
• Makes it difficult to let go of entrenched positions
• Makes people open to scams such as psychic readings
• The uncomfortable feeling of anxiety we get when the real
world does not fit our world view is called cognitive
dissonance.

5.
Fundamental attribution error
• A quirk in the way we reason about causality
• Attributing an aspect of a person's behaviour to their
fundamental character rather than the situation
• If I lose my keys I am just unlucky; if someone else
loses their keys they are careless
• What other examples are there?

6.
Framing
• People react differently to a choice depending on how the
information is framed
• Loss versus gain
• Positive versus negative
• Is it better to say a medical intervention has a 90% chance
of success or a 10% chance of death?
• It depends on what you desire as an outcome
• This bias is the cornerstone of marketing and political spin

7.
Anchoring
• Humans tend to rely on the first piece of information
offered when making decisions. This is the anchor.
• Subsequent decisions are made by adjusting away from
the anchor
• So in negotiations the first price offered sets the anchor
• Related to a similar effect called priming

8.
Illusory Superiority
• People tend to overestimate their positive qualities
relative to other people or groups
• This is widespread in all aspects of life
• Intelligence, sporting ability, academic performance, job
performance, popularity, confidence, …
• We even struggle to understand that in most cases it is a
nonsensical premise
• Even at Cambridge, around half of the students' academic abilities
are below average for their cohort
• This is linked to the Dunning-Kruger effect

9.
Survivorship Bias
• It is easy to see the things that survived – they are all
around us
• We often overlook the things that didn’t because they are
not visible
• Could be actual people – medical trials not taking into account the
people that dropped out
• Focusing on what leading businesses / business leaders did rather
than understanding what the countless others who failed did or did
not do
• This bias was faced for real during WWII when
investigating the cause of bomber losses

10.
Semmelweis Reflex
• This is the dismissal of new evidence because it does not
comply with the established norms of the day
• Named after Ignaz Semmelweis, who found a causal link
between childbed fever and the mortality rates of new
mothers
• Demonstrated washing hands could reduce death rates
• Ignored by his peers, as he could find no acceptable
scientific explanation and his contemporaries simply
refused to believe him
• Does this happen a lot in IT?

11.
Regression to the mean
• Not exactly a cognitive bias, but a similar failure in
statistical reasoning
• If a variable is extreme in the first measurement, it will
tend to be closer to the average for the second
measurement
• Gives rise to the idea that punishment is effective
• If someone performs badly and is punished it is more likely that the
next time their performance will be closer to the average
• Likewise with praise

12.
So now we know about them we are ok?
• Well, not so much
• Knowing about them does not mean that you will be able
to spot them all the time. It does help.
• If a decision is important, you must explicitly call out the
biases to make sure you are not being tricked