BEHAVIORS.AI - WORKSHOP

The workshop was organized on the occasion of the inauguration of the Labcom Behaviors.ai, a joint laboratory between the LIRIS Laboratory and the Hoomano company, funded by the ANR. The workshop was colocated with the SIDO 2019 event, whose theme this year was IoT, AI and Robotics, and which hosted more than 400 exhibitors. It offered researchers, practitioners, students, and anyone interested in AI and social robotics the opportunity to discover the behaviors.ai project and attend scientific talks by prestigious experts in the area of Human-Robot Interaction.

Prof. Yukie Nagai (University of Tokyo) during the inauguration

SPEAKERS

DR. YUKIE NAGAI

Project Professor, University of Tokyo

DEVELOPMENT OF SOCIAL COGNITION IN ROBOTS

Dr. Yukie Nagai has been investigating the neural mechanisms underlying social cognitive development by means of computational approaches. She designs neural network models that allow robots to acquire cognitive functions such as self-other cognition, estimation of others’ intentions and emotions, and altruism, based on her theory of predictive learning. The simulator developed by her group, which reproduces the atypical perception experienced in autism spectrum disorder (ASD), has had a great impact on society, as it enables people with and without ASD to better understand potential causes of social difficulties. She has been the research director of the JST CREST project Cognitive Mirroring since December 2016.

BAHAR IRFAN

University of Plymouth, SoftBank Robotics Europe

Bahar Irfan is an Early-Stage Researcher and a PhD candidate at the Centre for Robotics and Neural Systems, University of Plymouth, and the AI Lab, SoftBank Robotics Europe, within the joint Marie Skłodowska-Curie ITN project APRIL (http://april-robots.org). Her work focuses on multi-modal person recognition and personalisation in long-term human-robot interaction for conversational agents. She is also working on a joint project on socially assistive robotics with the Colombian School of Engineering Julio Garavito, jointly funded by the Royal Academy of Engineering.

MULTI-MODAL PERSONALISATION IN LONG-TERM HUMAN-ROBOT INTERACTION

In the future, long-term human-robot interaction (HRI) will be integral to domestic applications, education, and rehabilitation. However, user engagement can decrease over time if the interaction is based on a fixed set of behaviours. Personalisation can improve user engagement and facilitate building rapport and trust by adapting to the user’s personality, preferences, or needs. This talk will touch on two approaches to personalisation in long-term HRI: user identification, and progress tracking for socially assistive robotics.