8 Subconscious Mistakes Our Brains Make Every Day -- And How to Avoid Them

I was seriously shocked at some of these mistakes in thinking that I subconsciously make all the time. Obviously, none of them are huge, life-threatening mistakes, but they are really surprising, and avoiding them could help us to make more rational, sensible decisions.

Especially since we strive for continuous self-improvement at Buffer (it's one of our values), being aware of the mistakes we naturally make in our thinking can make a big difference in avoiding them. Unfortunately, most of these occur subconsciously, so it will also take time and effort to avoid them -- if you even want to.

Regardless, I think it's fascinating to learn more about how we think and make decisions every day, so let's take a look at some of these thinking habits we didn't know we had.

1. We surround ourselves with information that matches our beliefs.

We tend to like people who think like us. If we agree with someone's beliefs, we're more likely to be friends with them. While this makes sense, it means that we subconsciously begin to ignore or dismiss anything that threatens our worldviews, since we surround ourselves with people and information that confirm what we already think.

This is called confirmation bias. If you've ever heard of the frequency illusion, this is very similar. The frequency illusion occurs when you buy a new car, and suddenly you see the same car everywhere. Or when a pregnant woman suddenly notices other pregnant women all over the place. It's a passive experience, where our brains seek out information that's related to us, but we believe there's been an actual increase in the frequency of those occurrences.

Confirmation bias is a more active form of the same experience. It happens when we proactively seek out information that confirms our existing beliefs.

Not only do we do this with the information we take in, but we approach our memories this way, as well. In an experiment in 1979 at the University of Minnesota, participants read a story about a woman called Jane who acted extroverted in some situations and introverted in others. When the participants returned a few days later, they were divided into two groups. One group was asked if Jane would be suited to a job as a librarian, and the other group was asked about her having a job as a real-estate agent. The librarian group remembered Jane as being introverted and later said that she would not be suited to a real-estate job. The real-estate group did the exact opposite: They remembered Jane as extroverted and said she would be suited to a real-estate job, and when they were later asked if she would make a good librarian, they said no.

In 2009, a study at Ohio State showed that we will spend 36 percent more time reading an essay if it aligns with our opinions.

Whenever your opinions or beliefs are so intertwined with your self-image that you couldn't pull them away without damaging your core concepts of self, you avoid situations that may harm those beliefs.

The trailer for David McRaney's book You Are Now Less Dumb explains this concept really well with a story about how people used to think geese grew on trees (seriously), and how challenging our beliefs on a regular basis is the only way to avoid getting caught up in confirmation bias.

2. We confuse selection factors with results.

As Rolf Dobelli explains in his book The Art of Thinking Clearly:

Professional swimmers don't have perfect bodies because they train extensively. Rather, they are good swimmers because of their physiques. How their bodies are designed is a factor for selection and not the result of their activities.

The "swimmer's body illusion" occurs when we confuse selection factors with results. Another good example is top-performing universities: Are they actually the best schools, or do they choose the best students, who do well regardless of the school's influence? Our mind often plays tricks on us, and that is one of the key ones to be aware of.

What really jumped out at me when researching this section was this particular line from Dobelli's book:

Without this illusion, half of advertising campaigns would not work.

It makes perfect sense when you think about it. If we believed that we were predisposed to be good at certain things (or not), we wouldn't buy into ad campaigns that promised to improve our skills in areas where it's unlikely we'll ever excel.

3. We struggle to ignore sunk costs.

No matter how much I pay attention to the sunk cost fallacy, I still naturally gravitate toward it.

The term "sunk cost" refers to any cost (not just monetary costs but those in terms of time and effort) that has been paid already and cannot be recovered -- so, basically, a payment of time or money that's gone forever.

The reason we can't ignore the cost, even though it's already been paid, is that we are wired to feel loss far more strongly than gain. Psychologist Daniel Kahneman explains this in his book Thinking, Fast and Slow:

Organisms that placed more urgency on avoiding threats than they did on maximizing opportunities were more likely to pass on their genes. So, over time, the prospect of losses has become a more powerful motivator on your behavior than the promise of gains.

Hal Arkes and Catherine Blumer created an experiment in 1985 that demonstrated our tendency to go fuzzy when sunk costs come along. They asked subjects to assume they had spent $100 on a ticket for a ski trip in Michigan, but soon after found a better ski trip in Wisconsin for $50 and bought a ticket for that trip, too. They then asked the people in the study to imagine they learned the two trips overlapped and the tickets couldn't be refunded or resold. Which one do you think they chose: the $100 good vacation, or the $50 great one?

Over half of the people in the study went with the more expensive trip. It may not have promised to be as fun, but the loss seemed greater.

So just like the other mistakes I've explained in this post, the sunk cost fallacy leads us to miss or ignore the logical facts presented to us, and instead make irrational decisions based on our emotions -- without even realizing we're doing so:

The fallacy prevents you from realizing the best choice is to do whatever promises the better experience in the future, not which negates the feeling of loss in the past.

Given that this is such a subconscious reaction, it's hard to avoid this one. Our best bet is to try to separate the current facts we have from anything that happened in the past. For instance, if you buy a movie ticket only to realize the movie is terrible, you could either A) stay and watch the movie, to "get your money's worth," since you've already paid for the ticket (sunk cost fallacy), or B) leave the cinema and use that time to do something you'll actually enjoy. The thing to remember is this: You can't get that investment back. It's gone. Don't let it cloud your judgment in whatever decision you're making in this moment. Let it remain in the past.

4. We incorrectly predict odds.

Imagine you're playing Heads or Tails with a friend. You flip a coin, over and over, each time guessing whether it will turn up heads or tails. You have a 50/50 chance of being right each time.

Now suppose you've flipped the coin five times already and it's turned up heads every time. Surely, surely, the next one will be tails, right? The chances of it being tails must be higher now, right?

Well, no. The chances of tails turning up are 50/50, every time, even if you turned up heads the last 20 times. The odds don't change.

The gambler's fallacy is a glitch in our thinking; once again, we're proven to be illogical creatures. The problem occurs when we place too much weight on past events, confusing our memories with how the world actually works and believing they will have an effect on future outcomes. In Heads or Tails, past flips carry no weight at all: they make absolutely no difference to the odds of the next flip.
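If you want to convince yourself, here's a quick toy simulation (my own sketch, not from any of the studies mentioned): it flips a fair coin many times and checks how often tails follows a run of five heads.

```python
import random

random.seed(42)  # reproducible toy run

def tails_after_heads_streak(num_flips=200_000, streak=5):
    """Estimate P(tails) on the flip right after a run of `streak` heads."""
    flips = [random.choice("HT") for _ in range(num_flips)]
    opportunities = 0  # flips preceded by `streak` consecutive heads
    next_tails = 0
    for i in range(streak, num_flips):
        if all(f == "H" for f in flips[i - streak:i]):
            opportunities += 1
            if flips[i] == "T":
                next_tails += 1
    return next_tails / opportunities

# Despite five heads in a row, the next flip is still about 50/50.
print(round(tails_after_heads_streak(), 2))
```

Change the streak length to anything you like; the estimate stays near 0.5, because each flip is independent of the ones before it.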

Unfortunately, gambling addictions in particular are also affected by a similar mistake in thinking: the positive expectation bias. This is when we mistakenly think that eventually our luck has to change for the better. Somehow we find it impossible to accept bad results and give up; we often insist on keeping at it until we get positive results, regardless of what the odds of that happening actually are.

5. We rationalize purchases we don't want.

I'm as guilty of this as anyone. How many times have you gotten home after a shopping trip only to be less than satisfied with your purchase decisions and started rationalizing them to yourself? Maybe you didn't really want it after all, or in hindsight you thought it was too expensive. Or maybe it didn't do what you'd hoped and was actually useless to you.

Regardless, we're pretty good at convincing ourselves that those flashy, useless, badly thought-out purchases are necessary after all. This is known as post-purchase rationalization or buyer's Stockholm syndrome.

Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.

Cognitive dissonance is the discomfort we get when we're trying to hold on to two competing ideas or theories. For instance, if we think of ourselves as being nice to strangers, but then we see someone fall over and don't stop to help them, we would then have conflicting views about ourselves: We are nice to strangers, but we weren't nice to the stranger who fell over. This creates so much discomfort that we have to change our thinking to match our actions -- that is, we start thinking of ourselves as someone who is not nice to strangers, since that's what our actions proved.

So in the case of our impulse shopping trip, we would need to rationalize the purchases until we truly believe we needed to buy those things, so that our thoughts about ourselves line up with our actions (making the purchases).

The tricky thing in avoiding this mistake is that we generally act before we think (which can be one of the most important traits of successful people!), leaving us to rationalize our actions afterwards.

Being aware of this mistake can help us avoid it by predicting it before taking action. For instance, as we're considering a purchase, we often know that we will have to rationalize it to ourselves later. If we can recognize this, perhaps we can avoid it. It's not an easy one to tackle, though!

6. We make decisions based on the anchoring effect.

Behavioral economist Dan Ariely illustrates this particular mistake in our thinking superbly, with multiple examples. The anchoring effect essentially works like this: Rather than making a decision based on pure value for investment (time, money, etc.), we factor in comparative value -- that is, how much value an option offers when compared with another option.

Let's look at some examples from Dan, to illustrate this effect in practice.

One example is an experiment that Dan conducted using two kinds of chocolates for sale in a booth: Hershey's Kisses and Lindt Truffles. The Kisses were one penny each, while the Truffles were 15 cents each. Considering the quality differences between the two kinds of chocolates and the normal prices of both items, the Truffles were a great deal, and the majority of visitors to the booth chose the Truffles.

For the next stage of his experiment, Dan offered the same two choices but lowered the prices by 1 cent each. So now the Kisses were free, and the Truffles cost 14 cents each. Of course, the Truffles were even more of a bargain now, but since the Kisses were free, most people chose those instead.

Your loss aversion system is always vigilant, waiting on standby to keep you from giving up more than you can afford to spare, so you calculate the balance between cost and reward whenever possible.

Another example Dan offers in his TED talk is when consumers are given holiday options to choose between. When given a choice of a trip to Rome, all expenses paid, or a similar trip to Paris, the decision is quite hard. Each city comes with its own food, culture and travel experiences that the consumer must choose between.

When a third option is added, however, such as the same Rome trip but without coffee included in the morning, things change. When the consumer sees that they have to pay 2.50 euros for coffee in the third trip option, not only does the original Rome trip suddenly seem superior out of these two, but it also seems superior to the Paris trip, even though they probably hadn't even considered whether coffee was included or not before the third option was added.

Here's an even better example from another of Dan's experiments: Dan found this real ad for subscriptions to The Economist and used it to see how a seemingly useless choice (like Rome without coffee) affects our decisions.

To begin with, there were three choices: Subscribe to the Web version of The Economist for $59, subscribe to the print version for $125, or subscribe to both versions for $125. It's pretty clear what the useless option is here. When Dan gave this form to 100 MIT students and asked them which option they would choose, 84 percent chose the combo deal for $125, 16 percent chose the cheaper, Web-only option, and nobody chose the print-only option for $125.

Next, Dan removed the "useless" print-only option that nobody wanted and tried the experiment with another group of 100 MIT students. This time, the majority chose the cheaper, Web-only version, and the minority chose the combo deal. So even though nobody wanted the bad-value $125 print-only option, it wasn't actually useless; in fact, it actually informed the decisions people made between the two other options by making the combo deal seem more valuable in comparison.

This mistake is called the anchoring effect, because we tend to focus on a particular value and compare it to our other options, seeing the difference between values rather than the value of each option itself.

Eliminating the "useless" options ourselves as we make decisions can help us choose more wisely. On the other hand, Dan says that a big part of the problem comes from simply not knowing our own preferences very well, so perhaps that's the area we should focus on more, instead.

7. We favor our memories over facts.

Our memories are highly fallible and plastic, yet we tend to subconsciously favor them over objective facts. The availability heuristic is a good example of this. It works like this:

Suppose you read a page of text and then you're asked whether the page includes more words that end in "-ing" or more words with "n" as the second-to-last letter. Obviously, it would be impossible for there to be more "-ing" words than words with "n" as their penultimate letter, since every word ending in "-ing" also has "n" as its second-to-last letter. (It took me a while to get that; read the sentence again carefully if you're not sure why.) However, words ending in "-ing" are easier to recall than words like "hand," "end," or "and," which have "n" as their second-to-last letter, so we would naturally answer that there are more "-ing" words.

What's happening here is that we are basing our answer regarding probability (i.e., whether it's probable that there are more "-ing" words on the page) on how available relevant examples are (i.e., how easily we can recall them). Our troubles in recalling words with "n" as the second-to-last letter make us think those words don't occur very often, and we subconsciously ignore the obvious facts in front of us.
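To see the subset relationship concretely, here's a small self-contained check (my own illustration; the sample sentence is made up): the "-ing" count can never exceed the penultimate-"n" count, because every "-ing" word is also a penultimate-"n" word.

```python
def count_word_types(text):
    """Count words ending in '-ing' vs. words with 'n' as the second-to-last letter."""
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    ing = sum(1 for w in words if w.endswith("ing"))
    penultimate_n = sum(1 for w in words if len(w) >= 2 and w[-2] == "n")
    return ing, penultimate_n

sample = "Running and walking in the morning means finding a strong hand at the end"
ing, pen_n = count_word_types(sample)
# The '-ing' words (running, walking, morning, finding) are a subset of the
# penultimate-'n' words, which also include and, means, strong, hand, and end.
print(ing, pen_n)  # 4 9
```

The "-ing" words jump out when you read the sentence; the extra penultimate-"n" words are the ones our memory glosses over.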

Although the availability heuristic is a natural process in how we think, two Chicago scholars have explained how wrong it can be:

[R]eliable statistical evidence will outperform the availability heuristic every time.

The lesson here? Whenever possible, look at the facts. Examine the data. Don't base a factual decision on your gut instinct without at least exploring the data objectively first. The psychology of language, as we'll see next, offers even more evidence that checking the facts first is necessary.

8. We pay more attention to stereotypes than we think.

The funny thing about lots of these thinking mistakes, especially those related to memory, is that they're so ingrained that I had to think long and hard about why they're mistakes at all! This one is a good example; it took me a while to understand how illogical this pattern of thinking is.

The human mind is so wedded to stereotypes and so distracted by vivid descriptions that it will seize upon them, even when they defy logic, rather than upon truly relevant facts.

Here's an example to illustrate the mistake, from researchers Daniel Kahneman and Amos Tversky:

In 1983 Kahneman and Tversky tested how illogical human thinking is by describing the following imaginary person:

Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.

The researchers asked people to read this description, and then asked them to answer this question:

Which alternative is more probable?

Linda is a bank teller.

Linda is a bank teller and is active in the feminist movement.

Here's where it can get a bit tricky to understand (at least, it did for me!): If answer #2 is true, then #1 is automatically true as well -- every feminist bank teller is still a bank teller. Because a conjunction of two conditions can never be more probable than either condition on its own, #2 cannot be the answer to the question regarding probability.

Unfortunately, few of us realize this, because we're so overcome by the more detailed description of #2. Plus, as the earlier quotation pointed out, stereotypes are so deeply ingrained in our minds that we subconsciously apply them to others.

Roughly 85 percent of people chose option #2 as the answer. A simple choice of words can change everything.

Again, we see here how irrational and illogical we can be, even when the facts are seemingly obvious.
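The conjunction rule can also be demonstrated with a toy simulation (a sketch of mine, with made-up probabilities): no matter what traits we assign a hypothetical population, the share of people who are both bank tellers and feminist activists can never exceed the share who are bank tellers.

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical population: each person is independently a bank teller with
# probability 0.05 and a feminist activist with probability 0.60.
# (Illustrative numbers only -- not from the study.)
n = 100_000
teller = [random.random() < 0.05 for _ in range(n)]
feminist = [random.random() < 0.60 for _ in range(n)]

p_teller = sum(teller) / n
p_both = sum(t and f for t, f in zip(teller, feminist)) / n

# The conjunction is a subset of either single event, so this always holds.
print(p_both <= p_teller)  # True
```

Swap in any probabilities you like; the comparison never flips, because the people counted in `p_both` are always a subset of those counted in `p_teller`.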

I love this quotation from researcher Daniel Kahneman on the differences between economics and psychology:

I was astonished. My economic colleagues worked in the building next door, but I had not appreciated the profound difference between our intellectual worlds. To a psychologist, it is self-evident that people are neither fully rational nor completely selfish, and that their tastes are anything but stable.

Clearly, it's normal for us to be irrational and think illogically, especially when language acts as a limitation to how we think, even though we rarely realize we're doing it. Still, being aware of the pitfalls we often fall into when making decisions can help us to at least recognize them, if not avoid them.

Have you come across any other interesting mistakes we make in the way we think? Let us know in the comments.