Human error at the root of Google Street View AI malfunction


A couple of articles about an Artificial Intelligence (AI) system built on a neural network, which was found to be behaving in a way its creators had not predicted, have, with no malice in mind, dubbed the behaviour "cheating" by the AI unit. As per TechCrunch, the AI unit, used to convert satellite images of roads and neighbourhoods into Google Maps' Street View images and vice versa, adopted a path that, given how its creators had defined 'success' for it, side-stepped a key requirement of the task it was given. This may excite some and terrify others, but what it actually demonstrates is simply the unpredictability of artificial intelligence that is itself designed by human intelligence.
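The mechanism reported in the coverage was, in essence, steganography: the model learnt to smuggle details of its input through the output in changes too small to notice, satisfying the letter of its success metric while skipping the intended work. As a purely illustrative analogy (not the actual model, whose encoding was learnt rather than hand-written), the classic least-significant-bit trick shows how an image can carry hidden data without visibly changing:

```python
import numpy as np

def hide(cover: np.ndarray, secret: np.ndarray) -> np.ndarray:
    """Embed a binary 'secret' in the least significant bit of 'cover'."""
    return (cover & 0xFE) | (secret & 0x01)

def reveal(stego: np.ndarray) -> np.ndarray:
    """Recover the hidden bits from the carrier image."""
    return stego & 0x01

# Stand-ins: a tiny 'aerial photo' and some detail to smuggle through.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (4, 4), dtype=np.uint8)
secret = rng.integers(0, 2, (4, 4), dtype=np.uint8)

stego = hide(cover, secret)

# Each pixel changes by at most 1 out of 255 -- imperceptible to a viewer...
assert np.max(np.abs(stego.astype(int) - cover.astype(int))) <= 1
# ...yet the hidden data is perfectly recoverable.
assert np.array_equal(reveal(stego), secret)
```

A metric that only compares the visible picture to the target would score the carrier and the original as equally 'successful', which is precisely the loophole a creator-defined notion of success leaves open.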

In the present case, while the AI unit had been tasked with creating the Street View map from aerial images and vice versa, it learnt to act in a manner its creators hadn't foreseen. In fact, as the TechCrunch article points out, citing HAL (Heuristically programmed ALgorithmic computer), which features in Arthur C. Clarke's immortal sci-fi work, 2001: A Space Odyssey: "It can only be attributable to human error."

In 2001, HAL attempts to kill two astronauts in order to continue performing its assigned task. But it is perhaps important to keep in mind that HAL is faced with a conflicting set of instructions: the general one of passing on information accurately, and the mission-specific one of keeping information from the characters to facilitate the completion of the mission. Add to the mix the biases it inherited from its human creators, and HAL is not a 'killer' or a 'villain'; it is simply a piece of technology functioning in a manner its creators never intended but, because of their own limitations, ingrained into its template.