
In the days after a Boeing Co. 737 Max 8 jet plunged into Indonesia’s Java Sea last October, company officials said they were moving quickly to update plane software suspected in the crash.

Six months and a second Max 8 disaster later, Boeing has yet to submit its fix to regulators. Last week, pilots and its airline customers left a Federal Aviation Administration meeting with no idea when the grounded model would fly again. “We’ve taken off our watches and put the calendars in the drawer,” American Airlines pilot Dennis Tajer said after the meeting.

Fixing software, it turns out, is no easy task. “Any time you change software code, it’s a major issue,” said Clint Balog, an Embry-Riddle Aeronautical University professor who studies the interaction between humans and computers in planes. “If you change even one small thing in a code, it can have downstream implications.”

In a video message Wednesday night, Boeing CEO Dennis Muilenburg said the company had finished its last test flight and was prepared to move forward with certification. The goal, he said, is to make the 737 Max “one of the safest airplanes ever to fly.”

His company needs to convince the now heavily scrutinised FAA - as well as skeptical international regulators - that the fix is safe and capable of being used in the Max 8 without requiring costly flight-simulator training for pilots, as the company has promised customers. That could prove tricky in the current environment, said Richard Aboulafia, an aircraft consultant and vice president at Teal Group in Fairfax, Virginia.

“I suspect the time spent so far is less about creating optimal software and more about proving to regulators that it’s OK,” Aboulafia said. The tradition of non-US aircraft regulators deferring to the FAA’s judgment calls is “hanging by a thread. The system now has many agencies who are determined to show that they have independent oversight.”

Cracks are already showing. On Wednesday, a day after an FAA-appointed pilot board recommended that the US not mandate simulator training for the Max 8, Canada’s transport minister said the training should be required no matter how long the planes remain grounded.

Software engineers need to ferret out ripple effects and unintended consequences, said Eric Feron, an aerospace software engineer at the Georgia Institute of Technology. “You have to look at the way the human is going to operate the plane. You have to consider the interactions with hardware, and other software,” he said. “We want to be sure, if we can be sure, that we have no negative interactions between software systems.”

MCAS proved vulnerable to those kinds of interactions. It relied on data from just one piece of hardware - a sensor that malfunctioned - before putting a plane into a dive that pilots didn’t see coming.

AI expanding

The use of software, artificial intelligence and automation systems continues to expand, not only to fly planes but to drive cars (and even decide who gets parole or a mortgage). When designed well, such systems can prevent fatigue and help humans make better decisions.

Risks emerge when they aren’t designed to manage the back-and-forth between human and machine, particularly when automation changes in ways the user doesn’t expect. In the case of the Max 8, pilots initially weren’t told the MCAS existed.

“Human-AI teams perform better than either alone, but when the AI is updated its behaviour may violate human expectations,” according to a January paper published by researchers at Microsoft, the University of Washington and the University of Michigan. “The system can’t be the same and also have a new button,” said Walter Lasecki, a University of Michigan professor and one of the study’s co-authors. “If you add a new button, you have to teach people how to use it.”

Balog, the Embry-Riddle professor, said aircraft automation introduces complications such as complacency, with pilots relying too much on computers, as well as a lack of transparency, where pilots don’t know what the computers are doing or why.

“I believe in the pilot understanding what is going on in the cockpit,” he said.

Software distrust

Many pilots distrust software, said John Barton, who flies the Max for a major airline. “Software by definition gets in between pilots and the airplane,” he said in an email. “Most pilots would prefer to fly the airplane mechanically, where we can feel what’s actually going on with the flight controls and with the airplane.”

Unlike rival Airbus, Boeing has preferred to give pilots, not automation, the final word on flying its planes. The MCAS software on its Max 8 was an exception.

According to Boeing, the software helped the Max 8 handle like earlier 737s. With a bigger engine, positioned differently, the Max 8 nose can tilt up more than pilots expect, risking a stall. The MCAS system was designed to push it back down automatically. That proved disastrous on the Lion Air and Ethiopian Airlines flights, when malfunctioning sensors incorrectly indicated that the plane was aimed so high it was in danger of stalling, and MCAS pointed it down. The resulting crashes killed a total of 346 people.
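The single-sensor design described above can be sketched in a few lines of pseudocode-style Python. This is purely illustrative: the function names, the angle-of-attack threshold and the trim value are hypothetical, not Boeing’s actual MCAS logic.

```python
# Illustrative sketch only: hypothetical names and thresholds,
# not Boeing's actual MCAS implementation.

STALL_AOA_DEG = 15.0  # hypothetical angle-of-attack threshold

def mcas_command_original(aoa_sensor_deg: float) -> float:
    """As described in reports: trust a single angle-of-attack sensor
    and command nose-down trim whenever its reading is too high."""
    if aoa_sensor_deg > STALL_AOA_DEG:
        return -2.5  # hypothetical nose-down trim increment
    return 0.0

# One faulty sensor reading high is enough to trigger a nose-down command:
print(mcas_command_original(40.0))
```

The weakness is visible in the structure itself: there is no cross-check, so a single bad input drives the output.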

The 737 Max should have been grounded after the first crash, said Tom Demetrio, a Chicago lawyer who is suing Boeing on behalf of Lion Air families. “That was the time to tell airlines, do not fly this plane until you hear from us that we know the cause and the cause has been corrected,” he said.

Boeing has said it began working on its software fix immediately, but that the work proved more complicated than initially thought, since the software hovers in the background of critical flight controls.

But some Boeing critics said the company might have moved faster if the first crash hadn’t involved Lion Air, a young airline with a history of maintenance and other troubles. “There were just so many factors that contributed with Lion Air,” said Hans Weber, an aerospace engineer with FAA experience.

Working for months

Then came the March 10 disaster, which involved widely respected Ethiopian Airlines. Two days later, Boeing’s Muilenburg said the company had been working for months on “software enhancements” designed to “make an already safe aircraft even safer.” On March 13, the US joined the rest of the world in grounding the Max 8.

Two weeks after that, Boeing unveiled its software fix to hundreds of pilots and airline executives in Seattle, saying the company would submit it to the FAA by month’s end, a timeline the company walked back within days.

In his video message Wednesday night, Muilenburg said the company had completed 120 test flights, spending 203 hours in the air checking the reworked system.

The updated software will assess readings from two sensors, turn itself off if they don’t agree and nudge the plane’s nose down if they do. To test the new system and convince regulators, the company ran computer models subjecting the fix to multiple speeds, angles and potential human or machine failures in the lab, in simulators and in a jet outfitted with flight-test equipment.
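The cross-check behaviour described above — read both sensors, turn off on disagreement, act only on agreement — can be sketched as follows. Again, the names, thresholds and trim value are hypothetical illustrations of the described behaviour, not the certified software.

```python
# Illustrative sketch only: hypothetical names and thresholds,
# not Boeing's certified software update.

STALL_AOA_DEG = 15.0     # hypothetical stall-warning threshold
MAX_DISAGREE_DEG = 5.5   # hypothetical sensor-disagreement limit

def mcas_command_updated(aoa_left_deg: float, aoa_right_deg: float) -> float:
    """As described in reports: compare both angle-of-attack sensors,
    disengage if they disagree, and nudge the nose down only when both
    agree the angle is dangerously high."""
    if abs(aoa_left_deg - aoa_right_deg) > MAX_DISAGREE_DEG:
        return 0.0  # sensors disagree: system turns itself off
    if min(aoa_left_deg, aoa_right_deg) > STALL_AOA_DEG:
        return -0.5  # hypothetical, milder nose-down increment
    return 0.0

print(mcas_command_updated(40.0, 5.0))   # disagreement: no command
print(mcas_command_updated(20.0, 19.0))  # agreement at high angle: nose-down
```

Redundancy changes the failure mode: a single bad sensor now disables the system instead of driving it.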

To Weber and Teal Group’s Aboulafia, the fix only highlights the original software’s flaws.