The 1978 Burnside Airplane Crash

When a jetliner plummeted into East Portland, an industry changed.

It was winter-dark, but 14-year-old Kevin-Michael Moore hopped on his Schwinn and rode into the streets of Rockwood. His dad had just called from his job at Portland International Airport to say that United Flight 173, inbound from Denver, had reported landing-gear problems. Moore, who knew the flight patterns over his neighborhood, hoped to see the plane. It was December 28, 1978.

UAL 173 had already circled for nearly an hour after the right-side landing gear deployed with a double ker-wham, so hard it rocked the aircraft. The impact shorted out the “down and locked” indicator lights in the cockpit. Captain Malburn “Buddy” McBroom feared the wheel was swinging free and would collapse on touchdown.

Now time (and fuel) ran out. As the plane approached PDX, all four engines died. The first officer radioed a mayday, saying, “We’re going down. We’re not going to make the airport.” McBroom aimed the jet for a dark patch among the house lights below, hoping it was a city park.

Moore pedaled south. Christmas lights still twinkled against the wet pavement. He neared a wooded lot at East 157th and Burnside, and he saw it: the DC-8, above him like a “three-story building,” eerily soundless before it tore into treetops and power lines, then smashed to earth 200 yards from him.

Ten people died, including the flight engineer and the lead attendant. Twenty-three suffered severe injuries. But 179 passengers and crew aboard survived—and, more astonishingly, the families who had lived in the two houses the plane crushed had both moved out just days before.

The crash’s ramifications, however, went far beyond those directly involved. What happened in the cockpit—or, more precisely, what didn’t happen—changed the American airline industry, and more. Flight 173 made clear the need for a better way to make decisions in a crisis. More than that, it now offers a sidelong glimpse of how such calamities can subtly reshape the systems that run the world.

In the weeks following the crash, one persistent question emerged: how could a properly fueled commercial flight simply run out of gas? The National Transportation Safety Board placed blame on McBroom, saying the pilot became so engrossed in the landing-gear problem and emergency landing preparations that he failed to act on the dire fuel situation.

But the board also reached a broader conclusion about how the crew communicated during the crisis. Voice recordings showed that crew members tried to warn McBroom about the fuel situation. But the NTSB report said they lacked “assertiveness.” Cockpit culture left little room to challenge a captain’s decisions. That was about to change.

Tom Cordell, a retired United Airlines pilot active in the era, remembers the aftermath. “In those days a lot of pilots came out of the military and were very independent, even tyrannical,” he says. Indeed, a half hour before the crash, McBroom casually radioed in a “ballpark” landing time to PDX air traffic control as the crew prepared the cabin for a rough landing: “I’m not gonna hurry the girls.... It’s clear as a bell and no problem.”

After UAL 173, Cordell says, United summoned flight crews to its Denver headquarters for retraining in Crew Resource Management (CRM), a new program initially developed by NASA to reduce human error by emphasizing participatory decision-making.

“The first thing they made us do was watch 12 Angry Men,” Cordell recalls. In the 1957 film, directed by Sidney Lumet, a lone juror raises questions casting doubt on a seemingly open-and-shut murder case until, one by one, the other jurors come to share his doubts. Cordell says the lesson was, “Keep talking if something is bothering you.”

Other airlines quickly followed United’s lead. Soon, the ideas spread into other high-stakes settings. After the Institute of Medicine’s scathing 1999 report on hospital errors, medical staffs throughout the US began learning CRM-style problem-solving. Firefighters jumped on board after a 2004 book recommended CRM as a “force multiplier” for successful decision-making in emergencies.

Mistakes still happen, of course. But many observers agree that the practices adopted by airlines after UAL 173 have gradually improved teamwork and safety in many high-risk professions.

McBroom understandably maintained he had done what he could. “I was very busy trying to fly an airplane in trouble,” he said. But the changes that followed that December night proved a larger point: in a crisis, even the most experienced leaders are only as good as their teams.