Tech: OK Google, er, cancel the driverless taxi

AI can imitate speech, but it is still liable to come a cropper

“OUR MISSION for our assistant is to help you get things done,” Sundar Pichai, Google’s chief executive, told its recent developers’ conference. He introduced Google Duplex, a human-voiced digital assistant, which booked a dinner reservation and a haircut with all the authentic “mmm-hm” and “um” pauses you and I might utter.

People were taken aback by how well the technology mimicked human speech, which is supposedly what sets it apart from other digital assistants such as Apple’s Siri and Amazon’s Alexa. But it may not be all it’s cracked up to be, at least not yet: it hasn’t been tested in the wild, and the jury is out on whether artificial intelligence (AI) is ready to tackle the complexity of two-way conversation.

Machines are good at mastering the intricacies of a finite environment — a chess board, say — but are easily tripped up when they encounter a situation for the first time. The more expansive the territory, the harder it is for a machine to map out every possible scenario.

That is why Alexa is good at shopping: it knows Amazon’s products intimately. If you ask for a snack, Alexa may suggest ice cream. But when I ask for a buttercream cake recipe, it offers one for pecan sour cream coffee cake — not what I want.

“They’re very dependent on matching things against their training data,” says Gary Marcus, professor of neural science and psychology at New York University. A self-driving car can identify and navigate round other vehicles and road signs, but if it sees a van on a billboard, it may be stumped. “The driverless car doesn’t know what to make of a van that’s 25ft above the road,” he says. “It doesn’t have enough situational awareness to navigate the real world.”

Marcus says we should be worried by mediocre AI systems. “There are few domains [where] you can fully trust AI, because it doesn’t understand the way the world works,” he says. “If you have an advertising system that doesn’t understand … then it’s not that big a deal. But if you have a driverless car that doesn’t understand the way the world works, that’s dangerous.”