Tag Archives: analysis

This is another article from Farnam Street, and I confess that up until a few days ago I’d never heard of them. The site is run by a guy named Shane Parrish, who is based here in Ottawa. There’s some really fascinating stuff on there, with decent curation and a lot of links. This article highlights that:

Not all of our grand schemes turn out like we planned. In fact, sometimes things go horribly awry. In this article, we tackle unintended consequences and how to minimize them in our own decision making.

You might think that the article is going to be about train-wreck ideas or the butterfly effect causing tsunamis. Not really. In fact, I would say it is more about linear thinking from good intentions to good outcomes without taking side effects into account: some unknown, some unforeseeable, some just missed because the planners stopped thinking too early. The article has a great quote from a book by William A. Sherden:

Sometimes unintended consequences are catastrophic, sometimes beneficial. Occasionally their impacts are imperceptible, at other times colossal. Large events frequently have a number of unintended consequences, but even small events can trigger them. There are numerous instances of purposeful deeds completely backfiring, causing the exact opposite of what was intended.

The conclusion is simple: systems thinking, or second-order thinking, is needed. But the article doesn’t pay much attention to the fact that the culprit often lies in defining the system too narrowly. The small system is part of a larger system, and it is the larger system that often produces the other effects (as in the examples of releasing a predator into a land to control one local population, not realizing that the predator will spread into the larger system). What I do like is the idea that sometimes the failure is in over-estimating the size of the system, assuming there are too many variables, and thus not trying at all to figure out ancillary effects.

Yet, if we know they exist (or in hindsight think we should have), the article explains some of the most common reasons:

Sociologist Robert K. Merton has identified five potential causes of consequences we failed to see:

Our ignorance of the precise manner in which systems work.

Analytical errors or a failure to use Bayesian thinking (not updating our beliefs in light of new information).

Focusing on short-term gain while forgetting long-term consequences.

The requirement for or prohibition of certain actions, despite the potential long-term results.

The creation of self-defeating prophecies (for example, due to worry about inflation, a central bank announces that it will take drastic action, thereby accidentally causing crippling deflation amidst the panic).
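The Bayesian point in Merton’s list — updating our beliefs in light of new information — can be made concrete with a small sketch. The numbers below are invented for illustration, not from the article:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' theorem."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Hypothetical scenario: we believe a plan has a 30% chance of backfiring.
# An early warning sign appears that is 80% likely if it will backfire,
# but only 20% likely if it won't.
posterior = bayes_update(prior=0.30,
                         p_evidence_given_h=0.80,
                         p_evidence_given_not_h=0.20)
print(round(posterior, 3))  # the warning sign should raise our estimate well above 30%
```

Failing to run this kind of update — clinging to the 30% prior after the warning sign — is exactly the analytical error Merton describes.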

However, the article goes even further, adding over-reliance on models and predictions (mistaking the map for the territory), survivorship bias, the compounding effect of consequences, denial, failure to account for base rates, curiosity, and the tendency to want to do something.

If you’re interested in goals and theory the way I am, then an article about “cross-training for the mind” and different ways of thinking in various disciplines is like catnip. When I saw the article, and that it was going to work through 113 different mental models, I couldn’t NOT click on that bait. In fact, their goal in the article is based on the following:

The overarching goal is to build a powerful “tree” of the mind with strong and deep roots, a massive trunk, and lots of sturdy branches. We use this tree to hang the “leaves” of experience we acquire, directly and vicariously, throughout our lifetimes: the scenarios, decisions, problems, and solutions arising in any human life.

Tendency to Want to Do Something (Fight/Flight, Intervention, Demonstration of Value, etc.)

Microeconomics & Strategy (14)

Opportunity Costs

Creative Destruction

Comparative Advantage

Specialization (Pin Factory)

Seizing the Middle

Trademarks, Patents, and Copyrights

Double-Entry Bookkeeping

Utility (Marginal, Diminishing, Increasing)

Bottlenecks

Prisoner’s Dilemma

Bribery

Arbitrage

Supply and Demand

Scarcity

Military & War (5)

Seeing the Front

Asymmetric Warfare

Two-Front War

Counterinsurgency

Mutually Assured Destruction
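One of the listed models, the Prisoner’s Dilemma, lends itself to a quick sketch. Using the textbook payoff matrix (years in prison, so lower is better — these values are the classic ones, not taken from the article), defecting is each player’s best response no matter what the other does, even though mutual cooperation leaves both better off:

```python
# Classic Prisoner's Dilemma payoffs: (my years, their years); lower is better.
PAYOFFS = {
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "defect"):    (3, 0),
    ("defect",    "cooperate"): (0, 3),
    ("defect",    "defect"):    (2, 2),
}

def best_response(their_move):
    """Pick my move that minimizes my prison time, given their move."""
    return min(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, their_move)][0])

# Defecting dominates regardless of the other player's choice...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual defection (2, 2) is worse for both than mutual cooperation (1, 1).
```

The gap between the individually rational outcome and the collectively better one is why this model shows up so often in discussions of unintended consequences.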

The article has lots of links explaining each of the models. It’s a treasure-trove of mental-improvement rabbit-holes. And perhaps the grounds for 113 new blog posts by me as I work through each of them! Mind-blowing.