
I have a great fascination for the aviation industry. I have no expertise in it, but for a layperson like me, it looks impressive from a risk management perspective. The commercial aviation industry is an immensely complex one, with a tremendous scope for a zillion things to go wrong – yet the number of mishaps today, given the scale of things, is relatively small. When an air disaster does occur, a thorough investigation is done and the findings used to improve regulations and safety from that point on. There are examples in this industry that can be emulated by other industries.

The Nat Geo Air Crash Investigation series on television is fascinating to watch. Some of the causes of air disasters in the shows I have watched have been attributed to the following:

Pilot fatigue – after flying many hours on tight schedules, fatigue sets in, resulting in irrational decision-making

A lack of clarity in communication between air traffic control and air crew and also among air crew

The co-pilot's inability to influence his or her senior (the captain) even when the co-pilot was right

A lack of basic flying skills when the auto-pilot was not working

Confusion about the units used for the numbers on the indicators (e.g. metric vs. imperial)

Commercial pressure resulting in decisions to fly despite weather and other conditions not being favourable

A maintenance worker forgetting to switch a control back to “manual” from “auto” after doing routine testing

Less than ideal conditions at airports, e.g. no ground radar, poor lighting and unclear signals

Insects building nests in an aperture while the aircraft was parked for two weeks, causing a malfunction

The last two in the list above are somewhat remote incidents, which would have been difficult to imagine before the investigations were documented. The rest are arguably avoidable through process improvements, training and detailed guidelines, and this, no doubt, is what feeds the ongoing feedback loop for safety improvements.

The recent air crash in Nepal in which all 19 people on board were killed (mostly people excited to be going trekking in the Himalayas) is a tragedy. We are told that there have been quite a few crashes in this region. I am hopeful that if some of the same rigour, discipline and effort applied to standard commercial aviation safety is applied to these flights, great strides in safety can be made.

It’s a common (sloppy, I would say) expression these days – stating just the number, “two”, and leaving the unit to be inferred – and I have even heard it used on news channels such as CNBC. As a physics student back in school, I had it rammed hard into me that the units are even more important than the numbers. “If I send you to buy eggs, I would rather you return with the wrong number of eggs than the right number of something else”, my physics teacher used to say.

It’s easy to think the meaning is obvious – “in this context it’s obvious it must mean two minutes”, you might say. It becomes a habit. Imagine then, talking to someone far away, in another culture, not accustomed to this expression, where traffic jams and a laid-back life are the norm – it will not be too difficult to see how the meaning could easily be mistaken to be two hours, two days or even two months!

Malcolm Gladwell, in his book “Outliers”, talked about how lapses in communication (a lack of clarity, and mitigated speech that downplays the urgency of a message) have resulted in airplane disasters. Another airplane crash investigation I watched on National Geographic’s “Air Crash Investigation” series traced the cause of the crash to confusion between the metric and imperial systems of measurement.
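To see how easily a bare number goes wrong, here is a toy sketch (my own illustration, not from any investigation): a fuel reading that carries its unit with it cannot be silently misread, whereas a bare number can be off by more than a factor of two depending on whether you assume kilograms or pounds.

```python
# Toy illustration: a quantity that refuses to be just a bare number.
# The class name, numbers and conversion are mine, purely for illustration.

LB_PER_KG = 2.20462  # pounds per kilogram


class Fuel:
    """A fuel quantity tagged with an explicit unit."""

    def __init__(self, amount, unit):
        if unit not in ("kg", "lb"):
            raise ValueError(f"unknown unit: {unit}")
        self.amount = amount
        self.unit = unit

    def in_kg(self):
        # Convert to kilograms, using the stored unit.
        return self.amount if self.unit == "kg" else self.amount / LB_PER_KG


# A bare reading is ambiguous: 22046 of what?
reading = 22046

# Stating the unit removes the ambiguity - and shows how far apart
# the two interpretations are.
as_pounds = Fuel(reading, "lb")
as_kilos = Fuel(reading, "kg")

print(round(as_pounds.in_kg()))  # about 10000 kg
print(as_kilos.in_kg())          # 22046 kg - more than double
```

The point is not the code itself but the discipline it encodes: the unit travels with the number, so no one downstream has to guess.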

We all think in terms of different units – and that’s fine, but why can’t we just be clear about what we mean? How much effort does it take to utter the two syllables “mi-nutes”, or the words for any other unit for that matter? I wish we would all make it a good habit to state our units – we really don’t need a huge disaster to happen before we realise how easy it is to avoid some risks.