Don’t worry, cars aren’t going all HAL 9000 on us, at least not yet. The worst your car might do is accidentally crank the stereo volume to maximum when you asked it to “phone Max.”

We’ve all been burned before by supposedly smart machines that tried and failed to help us. The adverts promise an “intelligent personal assistant” to make your life easier and what you get is, more often than not, an infuriatingly stupid and unhelpful device.

Voice control is getting better, but slowly. While its present iterations may not be good for much more than checking the weather on your iPhone, voice control has the potential to radically change the way we interact with our cars — possibly for the better.

The tipping point, when voice control becomes more help than hindrance, is nigh. This year, Mercedes, BMW, Audi, Fiat-Chrysler and others are beginning to offer natural voice control, and more automakers are sure to follow.

Dieter May, BMW’s top expert on these things, estimates that by 2021 voice will be the main mode of interaction for secondary functions, things that aren’t steering, accelerating, and braking. “Right now, we are continuing on a multi-modal approach, which means you still have the iDrive controller, you have the touchscreen, then you have gesture and then voice,” said May, senior VP of digital products and services. “I think the real killer will be the voice. Because it’s so natural you don’t need to mess around — the distraction is the least.”

A question of safety

For all of its novelty, reducing distraction could be the biggest benefit of this new technology. The stakes in a car are much higher than when you’re sitting in your living room asking Alexa to play some Kenny G.

Driver distraction is a factor in four million car crashes in North America each year, according to the RCMP.

Tech companies don’t want us to stop using our phones in cars; they want to make it safer (and legal) by getting us to talk to our devices instead of touching them.

“We believe voice is the future,” said Ned Curic, VP of Alexa Auto at Amazon. With more than 100 million Alexa-enabled devices already in people’s homes, the company is pushing into cars through its new Echo Auto mobile dongle. “We like the idea of Alexa providing simpler, more intuitive access to things like navigation and entertainment while allowing you to keep your hands on the wheel and eyes on the road,” he said.

Not everyone agrees voice control reduces distraction, though.

“It’s not your eyes that are distracted, it’s your brain,” the CAA cautions on its website, of the dangers of voice commands. “And often for longer than if you were pressing buttons.”

“It is highly distracting to use hands-free voice commands to dial phone numbers, call contacts, change music and send texts with Microsoft Cortana, Apple Siri and Google Now smartphone personal assistants,” concluded a 2015 study by University of Utah researchers, commissioned by the AAA Foundation for Traffic Safety.

The report found it took up to 27 seconds for drivers to regain full attention on the road after issuing voice commands, and that older drivers are much more distracted when issuing voice commands than younger drivers.

Voice control technology has improved a lot since 2015, but that doesn’t guarantee it has become less distracting. As with everything in the free-for-all that is the technological frontier, our appetites for the latest connected gadget, app or service — and these companies’ desires to sell them to us — move far faster than research on their potential downsides.

A brief history of talking to machines

Much like Dr. Frankenstein, we humans have long desired to converse with our creations. At the 1962 World’s Fair, IBM demonstrated the Shoebox, a machine that could do simple math and responded to six control words and 10 digits, spoken into a microphone. By the mid-1980s, IBM had an experimental program that recognized and transcribed 20,000 spoken words.

In the aughts, the first generation of Ford’s in-car SYNC voice control system only responded to 100 pre-programmed commands, and lord help you if you didn’t enunciate like an English teacher at a lectern. Then, in 2010, Ford bragged that SYNC was being upgraded to understand 10,000 commands. It was still terrible.

Using those early voice control systems was like learning a coding language just to turn up the heater. More recently, the advent of cars with a high-speed Internet connection allowed for cloud-based assistants that are not only better at understanding normal speech, but smarter when it comes time to provide answers.

“It’s a combination of enormous advances in cloud computing, machine learning, automatic speech recognition and natural language understanding,” said Amazon’s Ned Curic. “The latter two are especially important, as they’re the technologies that will allow drivers to interact with Alexa in a natural, conversational manner rather than memorizing specific commands for every task.”


Understanding our commands is no small feat, but providing useful responses to them is an even more formidable challenge. “In our cars, it’s hardly that it doesn’t understand what you’re saying; it’s more that it doesn’t have the answer,” said BMW’s Dieter May. To address the latter issue, the company will issue regular over-the-air updates, giving the in-car assistant more abilities.

“People need to understand that this is a process they engage in and that things constantly get better,” May said. When it comes to cellphones, we understand that. Annual operating system updates bring major new features. Customers look forward to a new version of iOS or Android.

“It’s a bit different when you have a phone or (Amazon) Echo device which evolves, versus a $50,000 car, which is a pretty big investment,” May continued. “You’ve always had that mindset that [a car] is finished. But it’s a new world.” Cars of the future will evolve as they age.

HAL 9000, but nice

Google says its Assistant will soon be on one billion devices globally, making it even more ubiquitous than Amazon’s Alexa. Google is pushing into the automobile with third-party Assistant-enabled gadgets, like the Anker Roav Bolt that plugs into any car’s 12-volt outlet. Samsung is working on getting its assistant, Bixby, into cars and recently announced a collaboration with Uber. Bixby could one day, for example, link to your smart fridge and let you know if you need to buy more milk on the way home.

Most major automakers, meanwhile, are working on voice-assistants of their own, not wanting to cede the in-car space to tech companies.

If all goes to plan, by around 2021, you’ll have all these AI assistants talking to each other in the background, all working to make your life easier and — hopefully — reducing driver distraction in the process. With such convenience will come a whole host of new issues, such as keeping all your personal info secure from car hackers, and the spectre of people trying to order toilet paper online while driving down the highway.

“I’m sorry Matt, I can’t place that order right now. Please keep your eyes on the road.”