When new technology goes badly wrong, humans carry the can

Madeleine Clare Elish

Who is to blame when an automated system fails? John Naughton covers Research Lead Madeleine Clare Elish’s academic article on “moral crumple zones” for The Guardian.

“‘While the crumple zone in a car is meant to protect the human driver,’ she writes, ‘the moral crumple zone protects the integrity of the technological system, at the expense of the nearest human operator. What is unique about the concept of a moral crumple zone is that it highlights how structural features of a system and the media’s portrayal of accidents may inadvertently take advantage of human operators (and their tendency to become “liability sponges”) to fill the gaps in accountability that may arise in the context of new and complex systems.’”