Robots to Learn Human Emotions

One of the things that humans will always have over robots is, for better or worse, the complexity of emotions. Computers can reason more rigorously and make faster, more accurate calculations, but they can't feel. There's a huge gap there.

Into that gap wades a group of European scientists with a project titled Feelix Growing. The group of 25 comes from six different countries, and the project will run for three years. Their ultimate goal is to learn how to help robots respond better emotionally to humans.

The scientists are focusing initially on simple stimuli that spur responses, both from people and from robots. The group will engineer robots to adapt their behavior to the sensory input they receive from humans, including using an artificial neural network to build a well of emotional responses based on experiential data. Video cameras, audio and contact sensors, and distance calculators will collect information such as human facial expressions and other body language.
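The article does not describe the project's actual software, but the idea of a neural network learning to map sensory readings to an emotional response can be sketched in miniature. The sketch below is purely illustrative: the feature names (mouth curvature, brow position, voice pitch) and the two-way approach/withdraw response are assumptions, not details from Feelix Growing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: each row is a sensor reading
# [mouth_curve, brow_raise, voice_pitch], scaled to 0..1.
# Label 1 = positive human expression, 0 = negative.
X = np.array([[0.9, 0.8, 0.7],
              [0.8, 0.9, 0.6],
              [0.1, 0.2, 0.3],
              [0.2, 0.1, 0.2]])
y = np.array([[1.0], [1.0], [0.0], [0.0]])

# A tiny feedforward network with one hidden layer,
# trained by plain gradient descent on squared error.
W1 = rng.normal(size=(3, 4)) * 0.5
W2 = rng.normal(size=(4, 1)) * 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    h = sigmoid(X @ W1)      # hidden-layer activations
    out = sigmoid(h @ W2)    # predicted response
    # Backpropagate the error through both layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

def respond(features):
    """Return 'approach' for a positive expression, else 'withdraw'."""
    p = sigmoid(sigmoid(np.array(features) @ W1) @ W2)
    return "approach" if p > 0.5 else "withdraw"
```

After training on the toy examples, `respond([0.9, 0.9, 0.8])` favors "approach" while `respond([0.1, 0.1, 0.1])` favors "withdraw": the network has adapted its behavior to the sensory input it was given, which is the principle the project applies at far greater scale.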

The project has been running for several weeks now, and some of the robots are already exhibiting imprinted behavior, such as looking to people for the kind of support that parents give their children.