Human Factors Role in SpaceShipTwo Accident

Human Factors was a central focus of the National Transportation Safety Board (NTSB) public briefings on the Virgin Galactic SpaceShipTwo accident that took place on October 31, 2014. SpaceShipTwo, operated by Scaled Composites under contract to Virgin Galactic, LLC (VG) and The Spaceship Company, LLC (TSC), broke up 13 seconds into its fourth powered test flight and was destroyed on impact near Koehn Dry Lake, California. The pilot was seriously injured and the copilot was fatally injured; there were no injuries on the ground. SpaceShipTwo uses a feathering system designed to provide stable reentry into the Earth’s atmosphere upon completion of a sub-orbital spaceflight. Per procedures, the system is unlocked by the copilot no earlier than 1.4 Mach and no later than 1.8 Mach. In this case the copilot unlocked the system prematurely at 0.92 Mach, which resulted in the break-up. The Executive Summary submitted to the NTSB by VG and TSC explains how this could have happened.

Although normal checklist procedures maintained the feather locks in the locked position until after obtaining a minimum speed of 1.4 Mach, the mishap copilot prematurely unlocked the system at approximately 0.92 Mach. This premature unlocking was indisputably confirmed by telemetric, in-cockpit video and audio data. At this speed, lift from the horizontal tails well exceeded the feather actuator’s ability to prevent a rapid aerodynamic extension of the feather system. These forces caused the feather to rapidly extend without any further pilot action or mechanical malfunction. . . Extension of the feather while in boosted flight under these conditions imparted over 9g’s of pitch up acceleration forces on the spaceship. These forces exceeded SpaceShipTwo’s designed structural load capability and resulted in its in-flight breakup. (pp. 3-4, Executive Summary, May, 2015)

While it is clear that the copilot’s error induced this catastrophic situation, it is not clear why. Thus, the investigation directed much of its focus to answering this human performance question. Interview data brought up during the hearing revealed that the copilot was an exemplary pilot, technically well qualified and safety conscious. The human performance investigator therefore turned to other possible issues, namely flight crew training and procedures that may have contributed to the error.

However, the effectiveness of simulator training varies greatly with level of fidelity. While crews may have practiced normal and non-normal procedures in a full mission scenario, many critical aspects of the operational environment were missing, including vibration and g-forces as well as elements of time pressure (e.g., completing tasks within 26 seconds, the consequences of a mission abort at 1.8 Mach if the feather system was not unlocked). When present, such conditions may result in much greater workload and stress than would be experienced in a flight simulator.

The lost opportunity to practice is one thing; failing to acknowledge that high levels of workload and stress also influence the development of procedures, which often serve as the pilots’ safety net, is another. Although the pilot and copilot had clearly defined procedures, many tasks were committed to memory and modified at the last minute, introducing additional error potential. Even though interviews revealed that the risk of prematurely unlocking the feather system was “common knowledge,” this risk was not explicitly called out in the Pilots Handbook or emphasized during training; nor was the issue raised during the Flight Readiness Review. What was emphasized was the importance of unlocking the feather system before 1.8 Mach in order to avoid an abort.

Organizational Safety System

In short, while human error was clearly the proximate cause of the accident, the NTSB also recognized that there should have been organizational support, such as training and procedures, to guard against the catastrophic consequences of human-induced errors. Although pilot error was considered in the hazard analysis, it was considered only in the context of a system failure; that is, only if a system failed did the analysis include the possibility of an incorrect pilot response. Pilot-induced errors occurring on their own were not analyzed. The NTSB human performance investigator listed the following areas where there was a lack of consideration for human error:

– System not designed with safeguards to prevent unlocking feather

– Manuals/procedures did not have warning about unlocking feather early

– Simulator training did not fully replicate operational environment

– Hazard analysis did not consider pilot-induced hazards

Other Implications

The Investigator-in-Charge presented a list of safety issues raised by this investigation. Some have direct ties to Human Factors:

– Human factors guidance for commercial space operators

– Database for commercial space operators on mishap lessons learned

However, there are implications and lessons learned that go beyond Scaled Composites, Virgin Galactic, and The Spaceship Company. Much discussion took place about the role of the Federal Aviation Administration’s Office of Commercial Space Transportation as well as the commercial space industry at large. It is relatively straightforward to guard against a recurrence of this specific accident. For instance, a feather lock system with an automatic mechanical inhibit to prevent unlocking or locking the feather locks during safety-critical phases of flight has already been designed.
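To make the logic of such an interlock concrete, here is a minimal sketch of a software analogue of the automatic inhibit described above. It assumes only the 1.4–1.8 Mach unlock window stated in the procedures; all function and constant names are hypothetical and do not come from any actual flight software.

```python
# Hypothetical illustration of a feather-lock interlock. The unlock window
# (1.4-1.8 Mach) is taken from the procedures described in the report;
# everything else is an assumption for the sake of the sketch.

UNLOCK_MIN_MACH = 1.4  # earliest permitted unlock speed
UNLOCK_MAX_MACH = 1.8  # latest permitted unlock speed (abort threshold)

def feather_unlock_permitted(mach: float) -> bool:
    """Return True only when the vehicle is inside the safe unlock window."""
    return UNLOCK_MIN_MACH <= mach <= UNLOCK_MAX_MACH

def request_feather_unlock(mach: float, locked: bool) -> bool:
    """Honor an unlock request only when the interlock permits it.

    Returns the new lock state (True = still locked). Outside the safe
    window the command is simply ignored, so a premature crew action
    cannot release the feather on its own.
    """
    if locked and feather_unlock_permitted(mach):
        return False  # unlock allowed
    return locked     # inhibit: request ignored outside the window
```

Under this sketch, an unlock request at 0.92 Mach (the mishap condition) would be inhibited and the feather would remain locked, while the same request at, say, 1.5 Mach would be honored. The point of the design is that a single crew action is no longer sufficient, by itself, to create the hazardous condition.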

The bigger question is how to anticipate other pilot-induced errors that may occur in other operational environments. We cannot fall back on the “failure of imagination” excuse. Rather, the FAA, in its official role, and the commercial space community as a whole can use their imagination and take a hard, systematic look at the far-ranging effects and consequences of Human Factors in this new endeavor. There already exist Human Factors principles, research, and lessons learned from high-risk operations that can be readily adopted and modified for commercial space operations. Human Factors are not only manifest as human errors; they can also be used to save the day and prevent an accident from happening.