When Your Self-Driving Car Wants to Be Your Friend, Too

Honda's New Electric Urban Vehicle (NeuV) is a self-driving concept car that uses Honda’s talking Automated Network Assistant, or HANA, to analyze and respond to data the vehicle collects about driver and passenger preferences and behavior. Credit: Courtesy of Honda Motor Co., Ltd.

Thirty-five years ago the TV series Knight Rider envisioned an artificially intelligent car that could develop a friendly rapport with its driver. That 1982 Pontiac Trans Am—also known as the Knight Industries Two Thousand (KITT)—dutifully served as Michael Knight’s crime-fighting partner, monitored his health through sensors in the seat and even used voice analysis to respond to the sarcasm in Knight’s cornball quips. Your next car won’t reach KITT’s level of awareness, wit or empathy—but Honda, Toyota and several other companies really are planning to make AI standard in all the vehicles they produce.

Honda unveiled one of the more ambitious—and fanciful—visions for AI in the cockpit at last week’s U.S. Consumer Electronics Show (CES) in Las Vegas. Its New Electric Urban Vehicle (NeuV) is a self-driving concept car that uses Honda’s talking Automated Network Assistant, or HANA, to analyze and respond to data the vehicle collects about driver and passenger preferences and behavior. HANA would essentially be a rolling virtual assistant in the mold of Amazon Alexa and the similar smartphone assistants that Apple, Google and Microsoft have programmed into their handsets.

Toyota also used CES to introduce a futuristic self-driving car designed to get to know its driver and passengers. The egg-shaped Concept-i car—still very much in the concept phase—would be run by an AI system called Yui whose presence is felt throughout the vehicle, thanks to pulsating mood lighting, massage beads built into the seats and the ability to take control of the wheel if its eye-tracking sensors determine that a driver is not paying enough attention to the road. Other car companies unveiled more concrete plans to integrate existing virtual-assistant technology into their vehicles starting this year. Ford will let drivers access their home Alexa devices from the road; General Motors will make IBM’s Watson software part of its OnStar service; and Mercedes-Benz drivers will be able to access Google Assistant from behind the wheel.

The NeuV and HANA—neither of which is yet slated for production—would go beyond looking up information, making recommendations and even driving autonomously. Honda described one scenario in which HANA would adjust the NeuV’s acceleration, handling, braking and suggested routes to match a driver’s experience level. The vehicle would operate conservatively for newly licensed drivers but offer more horsepower and curvaceous driving routes for those with more experience. The NeuV might also be programmed to help its owners earn extra money when they are at work, at the gym or in for the night—either by selling excess battery power back to the electric grid or autonomously functioning as an Uber-like on-demand taxi service. In the latter situation the vehicle would receive requests from nearby pedestrians, take them to their destinations and then return to a pick-up point designated by its owner.

HANA would also feature what Honda calls an “emotion engine.” Using an array of cameras and other sensors throughout the NeuV’s cabin, the system would read a driver’s actions, facial expressions, tone of voice and heart rate—cross-referenced against events listed on the driver’s daily calendar—to determine if the person behind the wheel is especially anxious or stressed. If so, HANA might create a soothing playlist or recommend a route to work that passes by the driver’s favorite coffee shop, according to the company.
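The cross-referencing Honda describes could be imagined as a simple decision rule. The toy sketch below is purely illustrative—every signal name, threshold and keyword is invented here; Honda has not published how HANA actually works:

```python
# Hypothetical sketch of an "emotion engine" decision rule, loosely modeled on
# Honda's description of HANA. All signals, thresholds and responses are
# invented for illustration; the real system is not publicly specified.

def infer_driver_state(heart_rate_bpm, expression, calendar_events):
    """Cross-reference a biometric reading and a facial-expression label
    with the day's calendar to guess the driver's emotional state."""
    stressful_event = any("deadline" in e or "interview" in e
                          for e in calendar_events)
    if heart_rate_bpm > 100 or expression == "tense":
        return "stressed" if stressful_event else "anxious"
    return "neutral"

def respond(state):
    """Pick soothing interventions of the kind the article describes."""
    if state in ("stressed", "anxious"):
        return ["play calming playlist",
                "route past favorite coffee shop"]
    return []

# Example: an elevated heart rate on the morning of a project deadline.
actions = respond(infer_driver_state(112, "tense", ["9am project deadline"]))
```

A production system would of course replace these hand-written thresholds with trained classifiers over continuous sensor streams; the point of the sketch is only the cross-referencing step, in which the same biometric reading is interpreted differently depending on calendar context.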

Cars like the NeuV and Concept-i raise questions about whether vehicles will someday be “sentient,” able to perceive and respond to drivers’ and passengers’ emotions. “Sentient tools can sense and make sense of their surroundings, and are socially cognizant of the people using them,” says Brian David Johnson, a futurist who until recently worked for chipmaker Intel Corp. and who late last year wrote a report titled “The Coming Age of Sentient Tools” for consulting firm Frost & Sullivan. An autonomous car processes information and makes independent decisions in order to obey traffic laws, conserve fuel, reduce pollution and above all protect the safety of drivers, passengers and pedestrians. “But what [would make] the autonomous car a sentient tool is when it understands and knows the driver and the passengers in the car on a personal basis, seeing them as individuals,” Johnson says.

But few expect that level of AI to arrive by the early 2020s, when many carmakers are promising highly or fully autonomous vehicles. Machines that can truly sense emotions and display sentient behavior are decades away, says Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence (AI2)—an organization that Microsoft co-founder Paul Allen formed in 2014 to focus on AI’s potential benefits. “Sentience” is not a formal discipline in AI research and is open to interpretation. But from Etzioni’s perspective, “There’s a big gap between a machine being able to categorize a reaction—whether it’s a facial expression or one’s tone of voice—as happy, stressed or neutral, and a machine that might be considered sentient or conscious.”

“AI algorithms are becoming increasingly complex but our devices help us make decisions rather than think for us at this point,” agrees Tim Persons, chief scientist at the U.S. Government Accountability Office, Congress’s auditing, evaluation and investigative arm—which prepares research, including a report on artificial intelligence due out later this year. “I tend to shy away from referring to machines as ‘sentient’ because it implies more than what a machine philosophically and existentially is doing,” Persons says. “Although technologies like Alexa, Siri and Google Assistant are being made to look and feel as though they are sentient.”

Honda acknowledges that as currently sketched, the NeuV is a composite of many technologies still in the relatively early stages of development. “I don’t want to oversimplify and say that you’ll see all of these elements in production in the near future but I don’t think this is an empty promise either,” says Jarad Hall, design group leader for Honda Advanced Design. The concept for HANA sprang from a partnership that Honda’s research and development division launched last year with Tokyo-based telecom SoftBank Corp. to develop an AI system that could read drivers’ emotions based on data from cameras and other sensors located throughout a vehicle.

If the ability to read and react to a driver’s emotions someday becomes standard equipment, watch out. Remember that KITT had an ejection seat—Michael Knight was never more than an eye roll or smartass remark away from being unfriended in a big way.
