Think You’re Making Good Decisions? Think Again!

Suppose I said to you “Sue’s got a bug”. Quickly now…what do you think Sue has? If you’re a programmer, you probably think there’s a defect in Sue’s code. But if you’re a doctor, perhaps the flu comes to mind. And if you’re an entomologist, a ladybug may be your first thought. Did you consider all three as possible outcomes? Probably not. And what about some others? If you’re a spy, you might think Sue found a listening device. If you sell cars, you might think Sue bought a Volkswagen Beetle. The list goes on and on.

So why didn’t all of these come to your mind? Well first off, I asked you to respond quickly, which reduced the time you spent thinking about it. And second, you based your response on your intuition, instinct, or experience. You responded reflexively. This is inherently how we make most decisions every day. Do you know how much fat and how many calories are in that Sausage McMuffin you ordered? Did you review the economic fundamentals before acting on a friend’s stock tip? Did you read the TripAdvisor reviews that mentioned bedbugs in that hotel you booked? The answers to all of these are probably “no”. We have neither the time nor the stamina to properly frame each decision in terms of uncertainty and risk.

The same is true in our working lives. The difference, however, is that we’re paid to make good decisions in our jobs, and those decisions often involve millions of shareholder dollars. In these situations, we can’t afford to think reflexively. Instead we need to think reflectively, which requires deliberate time and effort. Reflective thinking rests on three practices:

A staged approach, which focuses on determining what project stage you’re in, the key risks and uncertainties associated with that stage, and what data gathering and analyses you should undertake to decide whether to move to the next stage.

Probabilistic thinking, which requires that we quantify the range of possible outcomes and assign a degree of confidence to any given outcome. This is much better than providing a single deterministic value as the most-likely case, because that single value is rarely (if ever) the actual outcome.

Asking the right questions, which means decision-makers need to probe 1) the work used to justify the recommendation, 2) whether the base case could be pessimistic or optimistic, and 3) whether credible alternatives were considered.
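To make the probabilistic-thinking point concrete, here is a minimal sketch of reporting a range of outcomes instead of a single deterministic value. The lognormal parameters below are purely illustrative assumptions, not taken from any real play:

```python
import math
import random

random.seed(42)

# Hypothetical lognormal distribution of recoverable resources (MMbbl);
# the median (~50) and spread (sigma = 0.6) are illustrative only.
samples = sorted(random.lognormvariate(math.log(50), 0.6) for _ in range(10_000))

def percentile(data, p):
    """Nearest-rank percentile of an already-sorted list."""
    k = max(0, min(len(data) - 1, round(p / 100 * len(data)) - 1))
    return data[k]

# Industry convention: P90 is the low-side estimate (90% chance of
# exceeding it), P10 the upside. Report the range, not one number.
p90 = percentile(samples, 10)   # conservative
p50 = percentile(samples, 50)   # median
p10 = percentile(samples, 90)   # upside
print(f"P90 {p90:.0f} / P50 {p50:.0f} / P10 {p10:.0f} MMbbl")
```

Handing a decision-maker that P90/P50/P10 range, with the distribution behind it, communicates far more than a single “most-likely” figure ever could.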

This sounds straightforward enough, but companies struggle to implement and apply these processes to their decision-making on a consistent basis. New management teams want to reorganize the way things are done. Staff turnover erodes the memory of what worked and what didn’t. Teams have turf to defend and walls to build. All of these contribute to lapsing into reflexive thinking.

“So what”, you say. “Let’s be bold and use our gut to guide us”. Could this be a successful strategy? Occasionally it does work, which provides memorable wildcatter stories (consider Dad Joiner). But given that oil and gas companies are in the repeated-trials business, the law of large numbers will eventually catch up with you. For example, if we look at shale plays in the U.S., only about 20% of these have been commercially successful. You might get lucky by drilling a series of early horizontal wells in a shale play, but it’s more likely that you’ll squander millions of dollars you didn’t need to spend in order to realize that the play doesn’t work. In this sense, we’re like Alaskan bush pilots. There are old bush pilots and bold bush pilots. There are no old and bold bush pilots. If you want longevity, you need discipline.
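The repeated-trials point can be made with a little arithmetic. Assuming, hypothetically, that each play is an independent trial with the 20% commercial success rate quoted above, the odds shift quickly with portfolio size:

```python
# Illustrative repeated-trials arithmetic. The 20% figure is the U.S.
# shale-play success rate cited above; the portfolio sizes are hypothetical,
# and the independence of plays is a simplifying assumption.
p_success = 0.20

for n_plays in (1, 3, 5, 10):
    p_all_fail = (1 - p_success) ** n_plays
    print(f"{n_plays:2d} plays: P(at least one works) = {1 - p_all_fail:.0%}")
```

A single play is a 4-to-1 underdog, while ten independent plays give you roughly a 9-in-10 chance that at least one works — which is why disciplined, staged investment across a portfolio beats betting everything on one bold well.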

Recently, we’ve begun to understand more about how people make decisions with their gut. It turns out that these reflexive decisions are highly susceptible to cognitive bias: errors in thinking that lead to illogical interpretations and judgments. Some definitions, with examples from the oil and gas industry, are listed below:

Anchoring: relying too heavily on an initial reference value when making an evaluation. Example: focusing on one geological model or a favored seismic interpretation.

Availability: overestimating the likelihood of events that are more memorable. Example: the recent well drilled by an offset operator with a huge initial production rate.

Confirmation: interpreting data in a way that confirms our beliefs. Example: collecting data in the most prospective area and extending this interpretation elsewhere.

Framing: reacting to a particular choice depending on how it is presented. Example: only comparing your opportunity to successful analogs.

Information: having a distorted assessment of information and its significance. Example: equating missing or low-quality data with a low or high chance of success.

Overconfidence: overestimating the accuracy of one’s own interpretation or ability. Example: generating a narrow range of resource estimates.

Motivational: taking actions or decisions based on a desire for a particular outcome. Example: overstating the chance of success or size of the prize in order to get a project funded.

So if you’re going to make decisions “with your gut”, at least recognize the types of cognitive bias that could affect those decisions, and take steps to lessen their impact on your exploration risk analysis, resource play evaluation, or production type-curve generation.

With this in mind, we’ve come up with a new 2-day course at Rose and Associates called “Mitigating Bias, Blindness, and Illusion in E&P Decision-Making”. This course, in concert with our portfolio of courses, consulting, and software designed to help you think more reflectively about your project, is aimed at helping you make better decisions. Check out our offerings.