When technology bites back

From the 1912 sinking of RMS Titanic to the Chernobyl nuclear accident 30 years ago, technology has repeatedly confounded the confidence of its creators.

But it is still somehow a surprise today when we are led astray by our closest technological companions — mobile phones, GPS navigators, self-driving cars, or chatbot software that mimics human speech to converse online with anyone who wants a chat.

“We are increasingly surrounded by machines that are meant to make our lives easier,” said French philosopher Jean-Michel Besnier of the Paris-based National Centre for Scientific Research.

“The autonomous car, for example, is supposed to improve traffic, safety and give us more time. But man may feel increasingly that he is losing the initiative, that he is no longer at the controls and, because of it, no longer responsible.”

There is no end of GPS mishaps to attest to this.

In March last year, a bus driver taking 50 Belgian tourists to a French ski resort in the Alps selected the wrong ‘La Plagne’ out of three similarly named locations on his GPS. At no point, apparently, did he lose faith in the machine as it led him 600 kilometres (373 miles) in the wrong direction, until his passengers spotted the Mediterranean.

– Bloody clashes –
Four months later, a 59-year-old bus driver said he was just following his GPS when he drove a trans-European bus with 58 passengers under a low bridge in northern France, shearing off the top and seriously injuring six people.

Last month, two Israeli soldiers using the mobile phone navigating app Waze, which relies on users for real-time updates, mistakenly drove into a Palestinian refugee camp, sparking bloody clashes. Waze said the drivers were to blame for deviating from the suggested route and turning off a setting that warns of dangerous areas.

Indeed, our adaptation to new technology, rather than the technology itself, is frequently blamed for mishaps and even serious accidents.

The World Health Organisation warns that drivers using a mobile phone are four times more likely to be involved in a crash.

In other circumstances, too, the results of such distraction can be fatal.

At Spain’s famed Pamplona bull run in August last year, a 32-year-old man was killed when, absorbed in filming the running of the bulls with his mobile phone, he was caught unawares by one of the animals, which gored him from behind.

– Deadly train crash –
In one of the worst disasters blamed in part on mobile phone distraction, the driver of a Spanish train that crashed on July 24, 2013 outside the northern city of Santiago de Compostela was speaking on a mobile to a colleague on board just before the crash. The train flew off the tracks and ploughed into a concrete siding, killing 79 people.

Google took part of the blame in February after a self-driving car manoeuvred around some sandbags and was hit at low speed by a bus in Mountain View, California.

“This accident is more proof that robot car technology is not ready for auto pilot,” Consumer Watchdog privacy project director John Simpson said at the time.

Such risks cannot be blamed only on immature technology, said Valerie Peugeot, who looks into future developments at French telecoms leader Orange’s research and development network, Orange Labs. “We delegate to technology choices that historically were human choices,” she warned.

– Racist insults –
Even the world’s biggest technology firms can get it horribly wrong.

Last month, Microsoft had to withdraw “bot” software, named Tay, that it had designed to respond like a teenage girl to written comments from other users on Twitter.

“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” a Microsoft official said.

After being led down the wrong path by other users, Tay’s tweets ranged from support for Nazis and Donald Trump to sexual comments and insults aimed at women and blacks.

“C U soon humans need sleep now so many conversations today,” Tay said in its final post on Twitter.