SoftBank Taps Affectiva to Boost Pepper Robot’s Emotional IQ

Teaching robots to recognize human emotions and react appropriately is one of the grandest ambitions in robotics. Companies have made progress on this front in recent years, thanks in part to advances in machine learning, computer vision, speech recognition, and related technologies. But there is a long way to go.

Now, one of the big players in the field, SoftBank Robotics, is turning to a Boston startup to boost the “emotional intelligence” of its humanoid robot, Pepper. SoftBank Robotics said Tuesday it has teamed up with Affectiva to integrate the startup’s “emotion A.I.” software into Pepper to improve the robot’s ability to interpret people’s emotions and adapt its behavior in real time.

Financial terms of the partnership weren’t disclosed. The deal highlights the ambitions of SoftBank Robotics’ parent company, the Japanese tech giant SoftBank Group, to make advances in nearly every aspect of robotics, from computing to A.I. to the machines themselves. SoftBank’s other relevant deals in recent years include the acquisitions of robotics firms Boston Dynamics and Schaft from Google’s parent company Alphabet, the purchase of chipmaker Arm, and investments in companies such as Fetch Robotics.

Pepper was launched in 2015 by SoftBank and Aldebaran Robotics, a France-based humanoid robot developer that has been majority-owned by SoftBank since 2012. (Aldebaran was later renamed SoftBank Robotics.) More than 12,000 Pepper robots have been sold, Forbes reported in May. The machines are serving as companions to nursing home residents, as well as greeters in retail stores, banks, and hotels. Pepper has even been tapped to chant Buddhist sutras at funerals, in place of a human priest.

Since it was first announced in 2014, Pepper has been billed as having the ability to read human emotions. According to SoftBank Robotics’ website, Pepper’s cameras, microphones, and software enable it to perceive smiles, frowns, tone of voice, and body language, such as the tilt of a person’s head. The robot then reacts accordingly, perhaps by acting delighted if it thinks a person is happy, or by trying to comfort someone who seems sad.

In a press release, SoftBank Robotics said Affectiva’s software will expand Pepper’s emotion-detection capabilities. Affectiva says its technology can identify basic emotions such as joy, anger, and surprise, as well as “more complex cognitive states and expressions such as distraction, drowsiness, or differentiating between a smile and a smirk.”

“There’s a significant opportunity for robots like Pepper to improve the way we work and live, as we’ve seen through the many roles Pepper has already taken on as a companion and a concierge,” said Marine Chamoux, an affective computing roboticist with SoftBank Robotics, in a prepared statement. “But this is only the beginning—especially as Pepper continues to evolve and learns to relate to people in increasingly meaningful ways.”

The partnership fits into a broader shift in the way humans and machines interact, as we move beyond typing commands on keyboards. Internet-connected speakers from Amazon, Apple, Google, and others respond to voice commands. Pepper, Jibo, and other robots have voice and facial recognition capabilities. These capabilities are far from perfect. But as robots become more sophisticated and commonplace, companies such as Affectiva and Eyeris are attempting to make machines’ interactions with people more natural and human-like.

“As robots take on increasingly interactive roles with humans in many corners of society—spanning healthcare, retail, and even entering our homes—there’s a critical need for us to foster a deeper understanding and mutual trust between people and robots,” said Rana el Kaliouby, Affectiva’s co-founder and CEO, in a prepared statement. “Just as people interact with one another based on social and emotional cues, robots need to have that same social awareness in order to truly be effective as coworkers or companions.”