Who’s Really Responsible in the Cockpit?

The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of the aircraft.

FAR 91.3

Over the years a recurring question raised about the design of FBW aircraft has been whether pilots, constrained by software-embedded protection laws, really have the authority to do what is necessary to avoid an accident.

But this question falls into the trap of characterising the software as an entity in and of itself. The real question is whether the engineer who developed the software, and whose avatar the software is, should be the final authority.

An example to illustrate

Say that a passenger aircraft is g-limited to prevent overstressing the airframe during a high-speed upset. The g limit is set at 2.5g, the airframe design limit is known to be 3g, and the actual airframe survival (but bent) g limit is nominally 4.5g.

Through no fault of their own, the pilots of the so-designed aircraft encounter a high-speed upset and attempt to recover the aircraft at the 2.5g hard limit. Unfortunately, the altitude lost during the encounter is such that the aircraft impacts the ground with the loss of all on board. The subsequent accident investigation finds that if a manoeuvre of 3.2g, just 0.7g outside the 2.5g limit, had been made the aircraft would have recovered successfully, with some airframe damage.
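The essence of a hard protection law can be sketched in a few lines. This is purely illustrative (not real flight-control code; the function name and constants are invented for this scenario), but it makes the point concrete: under a hard law the pilot's demand is clamped, no matter how hard the stick is pulled.

```python
# Illustrative sketch of a hard g-limit protection law.
# All names and values are hypothetical, taken from the scenario above.

G_LIMIT = 2.5        # hard protection limit (g)
DESIGN_LIMIT = 3.0   # certified airframe design limit (g)

def commanded_load_factor(pilot_demand_g: float) -> float:
    """Return the load factor the FBW system will actually fly.

    Under a 'hard' protection law the pilot's demand is clamped to
    the limit regardless of what the situation requires.
    """
    return min(pilot_demand_g, G_LIMIT)

# The recovery in the scenario needed 3.2g; the system delivers 2.5g.
print(commanded_load_factor(3.2))  # 2.5
```

The decision that 2.5g is "enough" was made at a desk, long before the upset, which is exactly the point at issue.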

So the pilots knew what they had to do, but the software prevented them from doing it. Software that was, as it turns out, approved by the chief engineer for flight controls at manufacturer X. Is the chief engineer now responsible for the safety of the crew and passengers? The software certainly isn’t, software after all is insensate. It acts but is not an actor.

But, recounts the chief engineer for flight controls, “I was given a specification to develop my software solution against. So surely I cannot be held accountable, as I did not write the specification; that was the systems engineer’s responsibility…”

So who here has the ethical (and legal) obligation?

To my mind, for software, where a precise specification of desired behaviour is required, the systems engineer (or other specifier) must take responsibility for missing, ambiguous or just plain wrong requirements.

In fact my belief is that such a person is as responsible as if he had been personally sitting in the cockpit, for in a very real sense that’s exactly what he’s doing.

4 responses to “Who’s Really Responsible in the Cockpit?”

I’m a pilot and software developer. I don’t have experience of flying heavy commercial aircraft (although I have passed the necessary ground studies), nor do I have experience of building safety-critical systems. But I feel I can still make the following comments.

Your analysis above misses an important distinction. The software is merely a service (or tool), utilised by a human in an operational context. It is the operator of the service who has the ultimate responsibility. The systems engineer is responsible for the correct functioning of the software within its operational context and parameters – as defined by the specification. The engineer cannot possibly foresee the complete set of operational situations the system may operate within.

For example, the FBW system would have a “ground” mode (activated by landing gear micro-switches) which disables systems that can only function whilst airborne. The FBW system’s understanding of “on the ground” is defined by the micro-switch activation parameter (and other signals). If the aircraft is jacked up off its extended landing gear (but still on the ground), it is not the systems engineer’s fault that the aircraft now deduces it is airborne.
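The jacked-aircraft example can be captured in a tiny predicate. This is a hypothetical sketch of the logic being described, not any real FBW implementation; the function and signal names are invented for illustration.

```python
# Hypothetical air/ground logic: state deduced solely from
# landing-gear weight-on-wheels (WOW) micro-switches.

def is_airborne(wow_switches: list) -> bool:
    """True when no weight-on-wheels switch is compressed.

    The system's 'understanding' of being on the ground is exactly
    this predicate and nothing more.
    """
    return not any(wow_switches)

# Normal taxi: gear compressed, all switches made -> on the ground.
print(is_airborne([True, True, True]))     # False

# Aircraft jacked up in the hangar: gear extended, switches open.
# By its specification the system deduces it is airborne.
print(is_airborne([False, False, False]))  # True
```

The specification is satisfied in both cases; it is the mismatch between the specified model and the real-world context that misleads.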

It is the aircraft operator’s responsibility to ensure pilots understand how the FBW system’s specification relates to the real-world context.

Regardless, FBW systems revert to alternative/direct law when system inputs appear to stray outside the specification model – at which point human understanding and contextual awareness is required to operate the aircraft.
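The reversion idea described above can be sketched as a simple selector. This is a minimal illustration under assumed names and thresholds (none are from any real aircraft): when monitored inputs stray outside the modelled envelope, the system drops from normal law to a degraded law in which hard protections no longer constrain the pilot.

```python
# Illustrative control-law selector. Thresholds and channel counts
# are invented for this sketch, not taken from any real FBW system.

def select_control_law(airspeed_kts: float, aoa_deg: float,
                       valid_adc_channels: int) -> str:
    """Choose a control law based on the health of the input data."""
    if valid_adc_channels < 2:
        # Too little trustworthy data to protect anything:
        # hand full authority back to the pilot.
        return "direct"
    if not (30 <= airspeed_kts <= 500) or not (-10 <= aoa_deg <= 30):
        # Inputs outside the modelled envelope: degrade protections.
        return "alternate"
    return "normal"  # protections active
```

Note the asymmetry this creates: human judgement is trusted precisely when the system's model has broken down, but distrusted while the model still believes itself valid.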

The problem is, as you said, “The engineer cannot possibly foresee the complete set of operational situations the system may operate within.” But with hard laws that’s exactly what the person who defined the software requirements is doing. So, flying in normal mode, pilots are constrained by a ‘decision’ made elsewhere.

As a result, who is finally responsible starts to become murky, which I think contravenes the intent of the FAR regulations: to establish a very clear chain of responsibility. The software-as-avatar metaphor is just my attempt to highlight this shift in responsibility once we have software making operational decisions; after all, they’re called protection laws for a reason!

Another question to ponder is whether the protection laws are there to protect the passengers or the airframe. From a personal safety perspective I’d be happy to allow the pilot to pull enough g’s to save me, even if the pilot bent the aircraft. However, the airline (and the manufacturer by implication) cares very much about protecting a multi-million dollar asset, so they’d set the g-limit lower, with a hidden trade-off against passenger safety. Is this articulated anywhere? Hmmm, probably not.

I agree with your thinking about the software being insensate and about burden shifting. In fact, I applaud it. We often talk about “man vs machine” but in reality it is “human vs human”. The machine is just a tool.

Having said that, let me interject a different line of thinking. To what extent can complex software be thought of as collective memory or a collection of shared experiences, a function of culture (or subculture). Many complex software systems run into millions of lines of code and no single individual is responsible for it. It is a product of a group, not only a group at a discrete point in time but lessons carried over from the past.

If complex software is thought of as a cultural artifact, what does that imply about responsibility? I think there is a tendency to want to distil responsibility into a ‘mano a mano’ debate. But it really is not a one-to-one relationship; it is a one-to-many relationship.

Hi Daniel, I’m afraid I may have betrayed my cultural biases in that post. My background is defence and aviation, in which a large store is placed on accountability. In Australian military airworthiness the sentinel event was the loss of a 707 at sea on a training mission off Sale, Victoria, during the early 80s if memory serves. In the subsequent investigation the ‘powers that be’ found that, because responsibility was so diffused and confused, there was no one to hold accountable (read blame). The solution was the establishment of the airworthiness regulations, whose centrepiece is a very clear regulation of authority, accountability and responsibility.

But to answer your comment, yes I believe culture plays a huge role in the type of systems we develop, in fact I would advance the argument that it is the significant differences in national and organisational culture that are at the heart of the very different design philosophies of Airbus and Boeing. In my post ‘The Hal Effect’, I explore this theme further (https://msquair.wordpress.com/2009/09/09/the-hal-effect/).

As an Antipodean I’ve been lucky to work with both European and American companies and engineers over the years and observe them from the outside as it were. I’d note that there is definitely a difference in their ‘philosophy of engineering’. Americans tend to be very empirically oriented, while Europeans tend more towards the ‘system theoretic’ approach. I make no comment as to which is the better 😉

With a Bachelor’s in Mechanical Engineering and a Master’s in Systems Engineering, Matthew Squair is a principal consultant with Jacobs Australia. His professional practice is the assurance of safety, software and cyber-security, and he writes, teaches and consults on these subjects. He can be contacted at mattsquair@gmail.com