Follow me – Design for person-following robots

Mrs. Katz is an older lady living on her own. She is eighty years old, smart and witty as she has always been, but physically fragile. It is important for her to remain in her own home and maintain an independent life. This is why she has “Rupert”, a one-armed multi-functional platform robot that serves as her aide. While she finishes her breakfast at the small dining table in the kitchen, Rupert waits in its position by the kitchen doorpost for new instructions. Knowing that her grandson and his family are coming for a visit, Mrs. Katz wants to replace the tablecloth on her main dining table. She needs to get it from the closet in her spare bedroom. “Follow me,” she signals to Rupert as she heads toward the corridor.

This is where the story ends and our research begins.

Here is how it looks: a person-following robot

The Follow me setup

For Follow me we use a Pioneer LX robot running ROS on a Linux OS. The Pioneer LX is equipped with an integrated on-board computer. The robot includes 4 front and rear sonar sensors and a forward bumper panel containing 2 sensors. Its laser rangefinder has a 250° field of view and a maximum range of 15 meters. The laser scans a two-dimensional plane about 190 mm above the floor. An external Kinect 2.0 was added to the robot, mounted about 1.5 m above the ground. The Kinect is connected to an external laptop computer, on which we execute our person tracking and following code. Commands are sent to the robot's onboard computer over a router.
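The laptop-side following logic can be sketched as a simple proportional controller that turns the tracked person's position (in the robot frame) into linear and angular velocity commands, which would then be published to the robot over ROS. This is a minimal illustration under assumed gains, limits, and target distance, not the project's actual code.

```python
import math

def follow_command(person_x, person_y, target_dist=1.5,
                   k_lin=0.6, k_ang=1.2, max_lin=0.9, max_ang=1.0):
    """Compute (linear, angular) velocity commands from the person's
    position in the robot frame (x forward, y left, in meters).
    Gains and limits are illustrative assumptions."""
    dist = math.hypot(person_x, person_y)
    bearing = math.atan2(person_y, person_x)  # angle from robot heading to person
    # Proportional control on distance error and bearing, clipped to limits
    lin = max(-max_lin, min(max_lin, k_lin * (dist - target_dist)))
    ang = max(-max_ang, min(max_ang, k_ang * bearing))
    return lin, ang
```

For example, a person standing 3 m straight ahead yields a forward command at the linear limit and no turning, while a person at the target distance yields zero velocity.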

This work is part of the “Follow Me” project, Israeli Ministry of Science and Technology (MOST) grant #3-12060. The final technical report (in Hebrew) will be available on June 1, 2018.

In recent years there has been an increase in the use of service robots in many applications. Consequently, there is a growing need to encourage and perfect human-robot interactions. One common interaction that has been studied is robotic human-following. In this study, we aimed to deepen our understanding of the interaction experienced when a robot follows a human in various environments, both on straight and curved paths, by measuring and assessing tracking quality and human emotion. The complexity of the study derived from the intricacy of the technological environment, which included a Pioneer LX robot run by ROS on a Linux OS. To prevent the robot from falsely identifying objects in the environment as human legs, we conducted preliminary trials in which we tested the robot's sensitivity to noise and derived optimal sensitivity parameters. During these preliminary trials, we also tested various values of robot acceleration and distance from the human subject, and arrived at two sets of values worth testing.

To assess the effect of environment on the human-following experience, we tested the interaction in both a corridor-like environment and an open-space environment. Depending on the experimental condition, we varied the values of robot acceleration and distance. In some conditions, we also asked subjects to perform a secondary task while walking, to simulate a common scenario in which the person is focused on something else while walking. Twenty-five subjects were tested in each environment. To avoid variance in tracking quality, all subjects were required to wear high boots. After each trial, subjects answered a post-trial questionnaire that assessed their perception of the robot, the interaction with the robot, and the tracking quality. Subjects also answered a final questionnaire comparing experiences between trials.
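The kind of false-leg filtering tuned in the preliminary trials can be illustrated by a sketch that groups consecutive 2-D laser points into clusters and keeps only clusters whose width is plausible for a human leg. The thresholds below are hypothetical, not the parameters actually derived in those trials.

```python
import math

def leg_candidates(points, min_width=0.05, max_width=0.25, gap=0.1):
    """Group consecutive 2-D laser points (meters) into clusters, splitting
    wherever neighboring points are more than `gap` apart, then keep only
    clusters whose extent fits a human leg. Thresholds are illustrative."""
    clusters, current = [], [points[0]]
    for p, q in zip(points, points[1:]):
        if math.dist(p, q) <= gap:
            current.append(q)
        else:
            clusters.append(current)
            current = [q]
    clusters.append(current)

    def width(cluster):
        if len(cluster) < 2:
            return 0.0
        return max(math.dist(a, b) for a in cluster for b in cluster)

    return [c for c in clusters if min_width <= width(c) <= max_width]
```

A narrow cluster the width of a leg passes the filter, while a wide return such as a wall segment is rejected; raising `max_width` would make the detector more sensitive but more prone to false positives, which is the trade-off the preliminary trials explored.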

Statistical analysis of the questionnaire results included ANOVA tests. Results indicate that in the open-space environment, subjects reported feeling more comfortable and relaxed than in the corridor environment. Tracking quality was also better, as indicated by the number of times the robot lost the human while tracking. Furthermore, one of the parameter sets was found to be superior to the other during trials that did not include a task. This set was also preferred by subjects in both environmental conditions, as seen in their ratings of comfort and naturalness of walk. In trials that included a task while walking, subjects felt they adjusted themselves less to the robot and that the robot followed more efficiently. They also felt more natural and comfortable, albeit a bit more stressed. Lastly, subjects felt more relaxed with subsequent trials, and reported that the robot felt more adjusted to their movements.
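The ANOVA comparison can be illustrated with a minimal one-way F-statistic computation; the comfort ratings below are made-up numbers, not the study's data, and in practice a library routine such as scipy.stats.f_oneway would be used.

```python
def anova_f(*groups):
    """One-way ANOVA F statistic for two or more samples,
    in pure Python for illustration."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical comfort ratings on a 1-7 scale (NOT the study's data)
open_space = [6, 5, 6, 7, 5, 6]
corridor = [4, 5, 3, 4, 5, 4]
f = anova_f(open_space, corridor)
```

With these toy numbers the F statistic well exceeds the F(1, 10) critical value of about 4.96 at the 0.05 level, which is the kind of comparison behind the reported environment effect.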

In all contexts of robot-user interaction, person-following robots should accompany people in a safe, functional, usable, and pleasurable way. To improve the design of a person-following robot, this study evaluates the influence of user tasks on people's preferences for its following angle and their perceptions of its behavior. The method used was an experiment in which 32 participants were followed by a robot for a total of six walking trials: at three different following angles, once with an auditory task and once with a visual task. Results indicate that the type of user task influenced participant preferences and perceptions. For the auditory task, participants reported greater satisfaction with the robot's following behavior as the following angle increased. In contrast, for the visual task, participants were less satisfied with the robot's following behavior as the following angle increased. In addition, participant perception of the robot and its following behavior was more positive for the visual task than for the auditory task. Additional research is required to better understand how human preferences and perceptions depend on task modality or task complexity.
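The notion of a following angle can be made concrete with a small geometry sketch that computes where the robot should be relative to the walking person: 0° places the robot directly behind, and larger angles move it around toward the person's side. The distance and sign convention (positive angles toward the person's left here) are assumptions for illustration, not the study's definition.

```python
import math

def robot_target(px, py, heading, follow_angle_deg, dist=1.2):
    """World-frame target position for the robot: `dist` meters from the
    person at (px, py), offset from directly-behind the person's heading
    (radians) by `follow_angle_deg`. Convention and distance are assumed."""
    # Direction from the person to the target point, in the world frame
    theta = heading + math.pi - math.radians(follow_angle_deg)
    return px + dist * math.cos(theta), py + dist * math.sin(theta)
```

For a person at the origin heading along +x, a 0° following angle puts the robot at (-1.2, 0) directly behind, while a 90° angle swings it to the person's side.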

Keywords: Person-following robot, following angle, user task

Using gestures to instruct a robot – a study of natural interaction with elderly people

A robot’s ability to recognize hand gestures and voice commands may promote natural interaction between humans and robots. The aim of this study was to examine which voice commands and/or hand gestures elderly people would naturally select to guide the motions of a robot. Twelve elders aged 70-77 participated in this two-part Wizard-of-Oz (WoZ) study. Five navigational commands were evaluated: Stop, Follow me, Move right, Move left, and Move forward, in two forms of experimentation. The first part of the experiment implicitly elicited the use of each command using a simulated grocery store environment. In the second part, participants were explicitly asked to guide the robot to perform the five navigational commands. The data were analyzed for age-related differences relative to previous experiments with younger adults, and for task-related differences. Results indicated that elderly people varied more than younger people in how they chose to instruct the robot. The results could serve as a baseline for future development of a natural, user-friendly aural and gesture vocabulary suitable for older users.