SoftBank Robotics’ Pepper humanoid robot, already adept at recognizing human emotions such as joy, anger, or surprise, will soon expand its emotional intelligence to understand more complex cognitive states and expressions, thanks to a partnership announced today by SoftBank and Affectiva.

Affectiva’s Emotion AI will be integrated into Pepper, giving the robot the ability to “detect highly nuanced states based on people’s facial and vocal expressions” in real time, said the companies. While Pepper can already detect emotional cues via its microphones and cameras, the inclusion of the Emotion AI technology will allow it to detect more advanced states.

For example, Pepper will be able to identify cognitive states such as distraction and drowsiness, and to differentiate between a smile and a smirk. Affectiva said understanding these more complex states will give Pepper more meaningful interactions with people, allowing it to adapt its behavior “to better reflect the way people interact with one another.”

Human-machine interaction: The next generation

Pepper, already seen working in customer service roles in banks, retail stores, museums, and hotels around the world, will be able to expand its companion and concierge abilities through this partnership, Affectiva said.

“But this is only the beginning, especially as Pepper continues to evolve and learns to relate to people in increasingly meaningful ways,” stated Marine Chamoux, affective computing roboticist at SoftBank Robotics. “The partnership really signifies the next generation of human-machine interaction, as we approach a point where our interactions with devices and robots like Pepper more closely mirror how people interact with one another.”

Pepper’s new emotional intelligence abilities could detect your reaction to its wearing a Washington Redskins jersey. Credit: Eugene Demaitre

Dr. Rana el Kaliouby, co-founder and CEO of Affectiva, said there would be a need for deeper understanding and mutual trust between humans and robots as the machines take on more interactive roles in healthcare, homes, and retail environments.

“Just as people interact with one another based on social and emotional cues, robots need to have that same social awareness in order to truly be effective as coworkers or companions,” el Kaliouby said.

Affectiva, which spun out of the MIT Media Lab, utilizes machine learning, deep learning, computer vision, and speech science for its emotional intelligence technology, dubbed Emotion AI. The company said it has the “world’s largest emotion data repository,” with more than 7 million faces analyzed across 87 countries. The company also works with the automotive industry to provide multi-modal driver state monitoring and in-cabin mood sensing.

The two companies will be discussing the partnership and their vision for social robotics at Affectiva’s Emotion AI Summit, to be held on Thursday, Sept. 6, in Boston.

Keith Shaw is the editor-in-chief of Robotics Business Review. Prior to joining EH Media, he worked as an editor for Network World, Computerworld, and various newspapers across Massachusetts, New York, and Florida. He holds a degree in journalism from Syracuse University.