National Defense provides authoritative, non-partisan coverage of business and technology trends in defense and homeland security. A highly regarded news source for defense professionals in government and industry, National Defense offers insight and analysis on defense programs, policy, business, science and technology. Special reports by expert journalists focus on defense budgets, military tactics, doctrine and strategy.

Faster, Better, Cheaper: Why Not Pick All Three?

4/1/2012
By Dan Ward

Spend any time with (or as) an engineer, and you will probably hear the phrase “Faster, better, cheaper — pick two.” Sometimes referred to as The Iron Triangle, this supposedly self-evident truism is repeated with great regularity among technologists, program managers and engineers, usually to justify the extensive amount of time and money expended on large, high-tech projects.

Unfortunately, the idea that we can’t make simultaneous improvements in a project’s cost, schedule and performance gets little critical analysis, even among people who do critical analysis for a living. When presented with the Pick Two concept, technical professionals who would normally insist on reviewing hard data before reaching conclusions inexplicably hear it, believe it and join the chorus. The idea becomes a self-fulfilling prophecy as project leaders make unnecessary trade-offs, then conclude such trade-offs were inevitable. Thus is conventional wisdom born.

The funny thing about the Pick Two mantra is that it doesn’t hold up to scrutiny. The even funnier thing is that it gets so little scrutiny in the first place. For example, many people write off NASA’s experiment with faster-better-cheaper in the 1990s as if it was an embarrassing flop, but when pressed, precious few can say what exactly NASA attempted or accomplished under that banner. To help remedy that, let’s look at the numbers.

According to Howard McCurdy’s book “Faster, Better, Cheaper,” NASA launched 16 major missions between 1992 and 1999. Far from backyard science projects, these missions were some of the most challenging things NASA ever attempted, including missions to Mars, to the moon, several Earth-orbiting satellites and even an asteroid rendezvous.

Were these missions any good? Well, the Near Earth Asteroid Rendezvous (NEAR) project traveled 2 billion miles, intercepted the asteroid Eros, collected 10 times more data than anticipated, then glided to a smooth landing on Eros’ surface despite not being designed as a lander — the first time such a maneuver had ever been attempted. I would call that a win.

Similarly, the Pathfinder mission to Mars was designed to last less than one month, but it went on for three months, collected 17,000 images and was one of NASA’s proudest moments of the decade. It is worth noting that Mars is fiendishly difficult to visit. Despite making 19 attempts, the Russians never reached the Red Planet. Not only did the Pathfinder team put cutting-edge hardware on Mars, they did it faster, better and cheaper than the 1970s Viking mission.

By 1998, nine of the first 10 missions had succeeded wildly. So much for Pick Two. Interestingly, when you add up the cost for all 16 missions, the total is less than the amount spent on the traditionally managed Cassini mission to Saturn. Yes, that’s 16 missions for the price of one.

Let’s take a closer look at the program. The NEAR spacecraft launched a mere 27 months after the project was funded, and it cost less than two-thirds of the original estimate ($122 million instead of $200 million). Pathfinder came in at 1/15th the cost of the 1970s Viking mission to Mars (in constant-year dollars) and was built by one-third the people in half the time. The other FBC projects had similar tallies.

FBC wasn’t without challenges. After seven remarkable years, things went south in 1999, when four out of five missions crashed and burned, sometimes literally and always prominently. At the end of the day, only 10 out of 16 achieved their objectives. This 62 percent success rate was deemed too low, and NASA moved away from FBC despite a 2001 Inspector General report that recommended NASA “fully incorporate FBC into the strategic management process.” That recommendation was clearly an endorsement of the method, not a rejection. There was no evidence of a need to pick two.

On one hand, six failures out of 16 sounds like a lot of failure, but this calculation is not the most meaningful analysis and may even be misleading. Upon reflection, we find there is no limit to the number of attempts we can make. The only limiting factor is how much time and money we can spend.

Therefore, it makes more sense to calculate outcomes per dollar instead of per attempt. Doing the math this way, we find that during a seven-year period, NASA delivered 10 successful missions (and six failures) for less than the price of one.
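The per-attempt versus per-dollar arithmetic can be sketched in a few lines of Python. The mission counts come straight from the article; the dollar figures are normalized (not actual budget numbers) so that all 16 FBC missions together cost one Cassini-sized budget, per McCurdy’s tally:

```python
# Back-of-the-envelope comparison of per-attempt vs. per-dollar success.
# Mission counts are from the article; costs are normalized so that the
# entire 16-mission FBC portfolio equals one traditionally managed mission
# (Cassini). These are illustrative units, not real budget figures.

fbc_missions = 16          # FBC missions launched, 1992-1999
fbc_successes = 10         # missions that achieved their objectives
fbc_total_cost = 1.0       # whole portfolio, in Cassini-sized budget units
traditional_successes = 1  # one traditionally managed mission...
traditional_cost = 1.0     # ...for the same budget

per_attempt = fbc_successes / fbc_missions
print(f"Per-attempt success rate: {per_attempt:.0%}")  # 62%

# Per dollar spent, FBC delivered ten successes for the price of one.
fbc_per_budget = fbc_successes / fbc_total_cost
trad_per_budget = traditional_successes / traditional_cost
print(f"Successes per budget: FBC {fbc_per_budget:.0f}, traditional {trad_per_budget:.0f}")
```

The point of the normalization is that the six failed missions’ costs are already inside `fbc_total_cost`, so the ten-for-one figure holds even counting the failures.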

Ten for one is a pretty sweet return on investment. While the per-attempt calculation looks like a lot of failure, the per-dollar calculation shows FBC actually delivered an order of magnitude more success than the traditional approach. The fact that there were six failed projects is irrelevant because their costs were included in the total price tag. More importantly, we cannot conclude that FBC failed just because some projects did. Perfection was never in the cards.

Speaking of failure, what caused those six programs to flop? According to NASA’s FBC task force final report, most failures came from “poor communication and mistakes in engineering and management.” These issues are neither unique to the faster, better, cheaper approach nor universal within it, and are hardly reason to jettison the whole concept. They certainly would not have been avoided if the project managers had picked two and decided to be slower, worse or more expensive.

It must be acknowledged that during the same timeframe, other organizations made less successful attempts to copy faster, better, cheaper. Not surprisingly, treating it as a PowerPoint slogan was ineffective. The same goes for using FBC as an excuse to downsize or take unwarranted shortcuts with safety, testing or quality. However, treating it as a disciplined, deep practice — as NASA did for seven remarkable years — produced excellent results. By constraining complexity while emphasizing speed and thrift, America’s space agency hit its target over and over again. Interested readers can check out “99 Rules for Managing FBC Projects” by Alexander Laufer and Edward Hoffman for specific tips on NASA’s practices.

The data from NASA’s portfolio clearly show expensive complexity and endless delays are not inevitable. These missions boldly proved it is possible to simultaneously improve the cost, schedule and performance of high-tech projects — no need to pick two.

This brings us to today and the financial challenges facing the U.S. military. On Feb. 6, in a speech at the Center for Strategic and International Studies, Frank Kendall commented on acquisition reform efforts, saying “We tend to retry things every 10 years or so because we don’t remember what happened the last time they were tried ... because we don’t have any data.” He’s absolutely correct, of course, but in the case of faster, better, cheaper, things are a little reversed.

If we’re not retrying FBC, it’s because we incorrectly think we do remember what happened, focusing on one bad year and neglecting the seven good years that came before. Fortunately, we have plenty of compelling, accessible data about what really happened, if we will just look at it.

So, before we start making unfortunate and unnecessary tradeoffs, sacrificing speed and performance in the name of thrift, maybe we should reevaluate what happened at NASA in the 1990s. Maybe we should look again at the data. What the data tells me is this: FBC worked, and it’s worth another try.

Lt. Col. Dan Ward is an active duty officer in the U.S. Air Force, currently deployed to Afghanistan. The views expressed in this article are solely those of the author and do not reflect the official policy or position of the U.S. Air Force.
