We demand that leaders be decisive, but research in social psychology and behavioral economics suggests that decisiveness is not an unequivocal good. Studies on “mindset” reveal that, when contemplating an important decision, prematurely focusing on execution can exacerbate decision-making biases and lead to overconfidence and excessive risk-taking.

A wealth of research suggests that we can, in fact, become too decisive. Learn the pitfalls of an execution-focused mentality, and how to avoid them.

After nearly a decade leading Roman legions on a conquest through Gaul, Julius Caesar had to make a high-stakes decision. Either he could obey the Senate by disbanding his army and returning to Rome, or he could retain control of his victorious armies without imperium—the elected right to govern—an act of treason that would make war inevitable. He deliberated as he marched his legion south through Gaul toward the Rubicon River, which marked the northern border of the Roman republic. He reached the river in the early days of 49 BCE, and his decision to lead an army across the Rubicon into Italy marked a moment of commitment for Caesar; for Rome, it marked the death of the republic and the birth of an empire.1

Whether building empires or businesses, we’ve all faced conundrums that lack an easy answer.

A chief financial officer, for example, plays an increasingly central and multi-faceted role in many businesses, acting as caretaker of the company’s financial well-being, manager of its finance organization, strategic adviser, and driver of change.2 Accordingly, most CFOs confront multiple decision points every day. Some, such as fixing a meeting agenda, may border on the pedestrian. Others, like launching a new product, pursuing a merger, choosing an alternative supplier for a key component, or filling critical personnel vacancies, can have a material impact on a company’s fortunes; indeed, the fate of the business may hinge on a single decision. Compounding the challenge, many CFOs face external or self-imposed pressure to quickly and definitively reach a conclusion.

And no wonder. We are inundated with breathless proclamations about the unprecedented pace of modern business and ever-increasing competitive intensity. The implication, of course, is that by waiting a week (or a day, or an hour) before choosing a course of action, an opportunity might be missed, to the perpetual detriment of the fence-sitter. Serious business writing touts the importance of “decisiveness” for effective leadership.3 Classes profess to make you decisive, for both personal and professional gain. We are told that “the most important quality a leader can have is decisiveness,” and “indecisive” is among the worst epithets that can be leveled against an executive.4

However, a rich body of research in social psychology and behavioral economics suggests that decisiveness is not an unequivocal good. Studies on “mindset” reveal that, when contemplating an important decision, prematurely focusing on execution can exacerbate decision-making biases and lead to overconfidence and excessive risk-taking. This article offers suggestions for mitigating the pitfalls of execution-oriented mindsets—without falling victim to “analysis paralysis.”

The theory of action phases, pioneered in the 1980s by Peter Gollwitzer and Heinz Heckhausen, suggests that individuals move through distinct states of mind in the time leading up to and following a decision.5 Each mindset is tuned to a specific task, so switching from one to the next changes the way people receive, process, and act on information.

During the pre-decision period, individuals adopt a deliberative mindset. Here, people focus on adjudicating potential goals. They weigh information about the likelihood and value of different outcomes. Eventually, “the die is cast” and in the post-decision stage, an implemental mindset takes over (see figure 1). Think Caesar crossing the Rubicon into Italy. This is the time when a firm commitment culminates in a course of action and a new mindset. People stop deliberating, and thoughts turn to how to execute the decision.

To see how this might work in the real world, consider a company exploring a major acquisition. In a deliberative mindset, the CFO and her team might look at many potential targets, considering their capabilities, corporate cultures, and prices. Once the acquisition decision is made, an implemental mindset adapts to the task of making it a reality by, for example, announcing the deal to shareholders and integrating employees.

Similarly, when facing a key hiring decision, a deliberative mindset weighs how all of the candidates’ experiences and skills stack up and evaluates their potential fit with current team members. Once a candidate is decided upon, the accompanying implemental mindset kicks in and focuses on issues such as salary negotiations, onboarding, and the specific duties assigned to the new hire.

Naturally, the switch from a deliberative to implemental mindset must occur at some point. Unfortunately, many individuals and groups cross the mental Rubicon prematurely. People might “trust their gut” on an issue and jump to a quick decision of their own volition, driven perhaps by a desire to appear decisive. Alternatively, decision points may be imposed externally, based on deadlines set elsewhere in an organization. Similarly, the decision threshold can be crossed when an individual concludes, for whatever reason, that a particular outcome is inevitable. Even prudent steps, such as “thinking ahead” about how a potential decision would be executed, can prompt a shift to an implemental mindset—often without an individual even being completely aware of the change in outlook.

Herein lies the danger. Even if a decision seems correct at the time it was made, new facts may arise, warranting reconsideration. However, the implemental mindsets we adopt to help us achieve our chosen goals can exacerbate a host of judgmental and decision-making biases. An execution-oriented frame of mind may encourage “tunnel vision” and lead to overconfidence and excessive risk taking. In the end, individuals may stick to decisions that no longer make sense, with potentially disastrous consequences.

Do mindsets play a role in every decision?

Cereal or a bagel? The red tie, or the blue one? Do we really go through an extensive “pros versus cons” deliberative mindset before making these choices, and then carefully plan out the steps required to take a bowl out of the cupboard and the milk out of the refrigerator? Taken to an extreme, the model of action phases would seem to demand an implausible degree of calculation and forethought.

Thankfully, the model makes no such claims. Instead, several criteria must be met for the Rubicon framework to come into play. First, the decision must involve a “goal,” a bar the commonplace decisions of daily life rarely clear. Second, many actions are simply resumptions of a previous decision, so careful weighing of pros and cons is unnecessary. Finally, intense planning for execution applies only to actions where achieving the goal is potentially in doubt. We don’t need to strategize about how to brush our teeth in the morning; we just do it.7

That said, researchers in lab settings have demonstrated the ability to shift individuals into deliberative and implemental mindsets with remarkable ease, often by simply asking participants to think about a pending but unresolved decision (deliberative) or the steps required to execute a recent choice (implemental). It seems safe to conclude that “action phase” mindsets—and their attendant biases—are at work in most decisions that are important, unprecedented, and difficult—in other words, nearly all of the business decisions we might care about.

The downsides of decisiveness

Disregarding new information and making biased inferences

Preoccupied with a task, individuals in an implemental mindset often downplay or ignore new information in general, and new information about a previous choice in particular. This closed-mindedness can manifest, for example, in poorer short-term memory. In lab experiments, subjects in a deliberative mindset have better memory recall than those in an implemental mindset when asked to repeat a string of nouns unrelated to their decision.8 In a separate study, participants were presented with information about the pros and cons of an action and, simultaneously, the steps required to execute it. They were then asked to recall the information that was presented. Those in an implemental mindset recalled a greater proportion of the execution-focused information and less of the pros-and-cons information than those in a deliberative mindset. Underscoring the power of these mindsets, the information presented to participants was unrelated to their own decision; merely the structure of the information (whether about the desirability of some goal or how to achieve it) was sufficient to activate the bias.9

What’s worse, the information that is received is processed in a way that reinforces the already-made decision. When asked to consider the advantages and disadvantages of a course of action, those in an implemental mindset report few, if any, downsides. In contrast, those with a deliberative mindset are relatively balanced between positive and negative implications.10 As you might guess, this bias is particularly acute for information that contradicts or questions a chosen course of action.11

The implemental mindset encourages the onset of confirmation bias—the tendency to gather and rely on information that confirms what we already believe, and to avoid or discount data that may contradict our pre-existing positions. In short, we see what we want to see. Confirmation bias can be seen in everything from the downplaying of the threat of attack before Pearl Harbor to the way investors compose their portfolios.12

The closed-mindedness associated with an implemental mindset extends beyond just receiving new information. It also affects how strongly individuals hold a particular position, their willingness to entertain ambivalence or ambiguity about a decision, and retrospective evaluations of foregone choices. Experiments have demonstrated that individuals in implemental mindsets tend to adopt more extreme and rigid positions on issues—even those unrelated to a particular decision—due to a one-sided processing of information.13 Ironically, continued group deliberation may not be the solution, since discussions among like-minded individuals, such as those who have (at least internally) reached a similar decision, can actually lead to even more polarized views.14 Furthermore, studies have shown that, after choosing between two closely evaluated options, participants in an implemental mindset tend to see the unselected option much less favorably, in a phenomenon known as “spreading of alternatives.”15

The illusion of control

Individuals in an implemental state of mind are also more likely to overestimate the degree of influence they have over outcomes. In a series of trials using lab equipment, researchers asked participants to estimate how much control they could exercise over whether a light turned on by pressing a button. In reality, participants had no control (researchers controlled the light), but those who were in an implemental mindset believed they had a significantly greater degree of influence than others.16 In addition, individuals in a deliberative mindset were even less likely than the control group to fall victim to an illusory sense of control, underscoring the benefits of a deliberative mindset when weighing the feasibility of alternative goals.

The findings for both mindsets—greater perceived control for implemental, less perceived control for deliberative—are exacerbated by the reported importance of the decision. Post-decision, the higher the stakes, the more control we think we have over the outcome.

Excessive optimism

In one of the earliest demonstrations of the power of pre- and post-decision mindsets, researchers in 1968 asked gamblers at a racetrack to evaluate their horse’s odds of winning. Those who were asked immediately after making their bet were significantly more optimistic than those who were asked just before the bet was placed.17 Subsequent studies found similar effects when evaluating one’s own prospects. For example, participants were asked to choose among test materials and were either interrupted before deciding (deliberative mindset) or after they had decided (implemental). They were then allowed to choose the difficulty of the test they would ultimately take, and to predict their own performance. Those in an implemental mindset chose more challenging tests and were more optimistic about their prospects, despite the fact that both groups ultimately performed the same.18 In short, the very act of making a decision affects our outlook on the likelihood of success.

All individuals are susceptible to illusions of invulnerability, but those in implemental mindsets are particularly prone to downplaying risk.19 For example, one study placed participants in a deliberative or implemental mindset by having them ruminate on a significant life decision or on the steps required to execute a recent choice, respectively. They were then asked to evaluate their personal vulnerability and the average college student’s vulnerability to an array of controllable risks (such as developing a drug addiction or getting divorced) and uncontrollable risks (such as losing a limb or having a partner die young). Participants in an implemental mindset saw themselves as significantly less vulnerable to risk than either the deliberative or control groups, and they judged themselves to be particularly immune to controllable risks.20

Business leaders, even those we might expect to be the most fact-driven and sober, are no less vulnerable to overconfidence than others. Using massive, multi-year surveys of CFOs, researchers found that respondents significantly underestimated the volatility of an overall stock index and the share performance of their own company.21 During the sample period, about 33 percent of the stock index’s actual returns fell within CFOs’ 80 percent confidence estimates (had their estimates been accurate, 80 percent of returns would have fallen within that 80 percent interval). Those who erred in their market forecasts were similarly off when estimating the ROI for internal projects. Just as troubling, the more optimistic the CFO, the more their company invests (as measured by net investments). Similarly, overly optimistic CEOs are more likely to issue voluntary earnings forecasts, and those forecasts are themselves more likely to be unrealistically optimistic.22
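The calibration gap described above can be made concrete with a short sketch: given forecasters’ stated 80 percent confidence intervals and the realized outcomes, compute the fraction of outcomes the intervals actually covered. The `coverage_rate` helper and the interval data below are illustrative assumptions, not figures from the cited study; they are chosen only to reproduce the flavor of the finding (roughly a third of outcomes covered, where a well-calibrated forecaster would cover about 80 percent).

```python
def coverage_rate(intervals, actuals):
    """Fraction of actual values that fall within their stated [low, high] intervals."""
    hits = sum(1 for (low, high), actual in zip(intervals, actuals)
               if low <= actual <= high)
    return hits / len(actuals)

# Hypothetical data: each forecaster's 80% confidence interval for an annual
# stock-index return (%), paired with the realized return for that period.
forecast_intervals = [(2, 10), (0, 8), (-1, 6), (3, 9), (1, 7), (-2, 5)]
realized_returns   = [12.0,    4.5,   -8.0,    15.0,   3.0,   -11.0]

print(f"Coverage: {coverage_rate(forecast_intervals, realized_returns):.0%}")
# → Coverage: 33% (well-calibrated 80% intervals would cover ~80%)
```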

The dangers of decisiveness

While a decision-making bias does not guarantee a bad decision, real-world evidence suggests that poor decisions are all too often the byproduct of prematurely switching to an implemental mindset.

On February 1, 2003, the space shuttle Columbia broke apart as it reentered Earth’s atmosphere. All seven crew members died. The proximate cause of the disaster was a piece of insulating foam that broke from a fuel tank during liftoff, impacting the shuttle’s wing and causing catastrophic failure upon reentry. Implemental mindsets, coupled with other decision-making and organizational problems, may have served as contributing causes of this tragedy. For example, NASA officials became aware soon after liftoff that debris had struck the shuttle’s wing. Working-level engineers voiced concerns about the potential damage to the shuttle. Senior managers—who were more directly responsible for the decision to launch—reportedly downplayed the risk and were sanguine about the consequences of proceeding with the mission. In the view of engineers who attended an important briefing, “Management focused on the answer—that analysis proved there was no safety-of-flight issue—rather than concerns about the large uncertainties that may have undermined the analysis that provided that answer.”23 In other words, they interpreted new information in a way that likely reinforced their chosen decision—to proceed with the shuttle mission as planned.

The consequences of such biases for CFOs may be less dire, but they are no less real. These biases contribute to underperforming deals, cost overruns, and failed product launches. A study of over 100 M&A transactions whose express purpose was enhanced growth found that nearly three-fourths were “low-performing,” in that the combined firms’ growth was equal to or less than what the companies could have expected had they remained independent.24 Just 6 percent of large software projects—the kind likely to require CFO sign-off—come in on-time, on-budget, and with a successful implementation.25 The failure rate for new products ranges between 30 and 50 percent. As executives at a leading marketing firm noted, “[N]ew products can take on a life of their own within an organization, becoming so hyped that there’s no turning back.”26

These shortfalls can be amplified by other biases likely to be at work. For example, if a decision-maker already believes in the merits of a particular course of action, the previously mentioned confirmation bias can skew how new information is interpreted. Consider a classic study examining attitude changes among proponents and opponents of the death penalty. Researchers presented these individuals with two studies, one appearing to demonstrate the efficacy of the death penalty as a deterrent and the other offering evidence to the contrary. After reading both studies, individuals tended to exhibit attitude polarization. People’s initial views strengthened despite examining balanced evidence on both sides of the issue. They focused on the data that confirmed their pre-existing positions.27

Similarly, individuals may exhibit sunk cost bias as they move further down a decision path. In this trap, people find themselves throwing good money after bad and escalating their commitment to courses of action in which they have made substantial prior investments of time, effort, and money. Consider the momentum that sometimes builds as a CFO and other leaders spend an extensive period of time analyzing a potential acquisition. The implemental mindset may take hold as managers value synergies, conduct due diligence, and discuss acquisition integration. They can’t back away from the deal even if new risks emerge, since they have invested so much time analyzing and preparing for it.

The mindsets we adopt before and after making decisions are built for very different tasks. Pre-decision, our thoughts are oriented toward evaluating options and choosing a path based on feasibility and desirability. We tend to weigh information relatively objectively and are open to new facts about the choices before us. Post-decision, deliberation is often set aside and we focus intently on the specifics of execution. New information is ignored or skewed, we may see our current course of action in a more favorable light, and we downplay the attendant risks. We often overestimate our chances of success and the degree of our own influence. If we have crossed the mental Rubicon prematurely, these biases can create a path dependence that can leave outside observers scratching their heads.

Permanent deliberation also has its drawbacks, and implemental mindsets are well suited to the tasks for which they were created. Indeed, some research suggests that those in an implemental mindset are more persistent and execute tasks better than those caught in a lengthy deliberation mode.28 They can also help mitigate social loafing: people in an implemental mindset worked just as hard under collective performance evaluation (where groups, rather than individuals, are rated) as under individual performance evaluation, in contrast to those in a “neutral” mindset. That finding suggests there may be some advantages to promoting an implemental state of mind in employees.29

That said, for business leaders in general and CFOs in particular, prematurely switching to an implemental mindset seems the greater danger. Ignoring new information, foreclosing alternative courses of action, and succumbing to the “illusion of control” can lead decision makers to pursue excessively risky plans or miss opportunities to short-circuit a looming crisis. By being cognizant of these competing mindsets, executives can actively work to avoid the biases of the implemental state.

Being deliberate about decisiveness

Thankfully, research suggests that mindsets are an “active” phenomenon, in that we can consciously control them. Deliberative and implemental mindsets can be proactively triggered with relatively little effort.30 In many lab experiments, for example, mindsets are cued by simply asking subjects to think about the pros and cons of an impending choice (deliberative) or how to execute a decision they’ve recently made (implemental).

Accordingly, one key to successful implementation is adequate deliberation. To this end, executives should consider the following tactics before letting their mindset cross the Rubicon from deliberative to implemental:

Make sure you do have multiple options.

Put the decision point on a timeline or calendar, and before that deadline, focus on deepening understanding of the issue at hand rather than on the “go/no-go” decision. On the eve of another space shuttle disaster, the 1986 explosion of the Challenger, engineers and NASA administrators were narrowly focused on the decision to launch, whereas a broader inquiry into the alternative reasons why O-ring erosion might be occurring might have revealed the dangers ahead.31

Conduct a “premortem” of the available choices.32 Imagine that the decision, months or years down the line, has failed dramatically, and ask yourself and your team what contributed to the failure.

Consider using structured analytic techniques, such as a “devil’s advocate” or “dialectical inquiry.” In dialectical inquiry, multiple subgroups present the advantages of different alternatives and then critique each other’s proposals. A devil’s advocate, by contrast, is one or more people who assume responsibility for offering a constructive critique, which may surface additional options.33

Keep the project management tools on ice until you’ve committed to executing a project.

Pay attention to discordant data. Charles Darwin collected a great deal of data about his emerging theory of evolution, but he also kept a separate notebook containing observations that contradicted his hypotheses.34

Institute a “cooling-off” period before moving from deliberation to implementation.

Consider bringing unbiased outsiders into the conversation rather than simply assembling domain experts to discuss the decision.

Finally, do not simply decide what you will do and only then analyze how to do it; decision-making processes need not unfold in such a linear fashion from analysis to action. Consider implementation obstacles and risks as you make decisions, and encourage the team to step back occasionally from the “how should we execute this plan” discussion to ask whether the plan itself remains the proper course of action. By applying these techniques, CFOs can improve the likelihood that their decisions yield favorable results.

Lynne Guey, “Instant MBA: The most important trait a leader can have is decisiveness,” Business Insider, June 7, 2013, www.businessinsider.com/most-important-trait-a-leader-can-have-is-decisiveness-2013-6.

The same result held for capital punishment opponents. Charles G. Lord, Lee Ross, and Mark R. Lepper, “Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence,” Journal of Personality and Social Psychology 37, no. 11 (1979): 2098.

Peter Gollwitzer and Ute Bayer, “Becoming a better person without changing the self,” paper presented at the Self and Identity Pre-Conference of the Annual Meeting of the Society of Experimental Social Psychology, Atlanta, GA, 2000.

Gollwitzer, 1990. Indeed, research suggests that tasks can be “reframed” even after a decision has been made and implementation has begun, and that shifting from a “performance” frame focused on process and outcomes to a “learning” frame focused on exploration is associated with improved results. See Amy C. Edmondson, “Framing for learning: Lessons in successful technology implementation,” California Management Review 45, no. 2 (2003).