Learning Everyday Tasks from Human Demonstration

Submitted by admin on Mon, 12/28/2009 - 08:27

Peter Pastor, a PhD student at USC, spent the past three months
developing software that allows the PR2 to learn new motor skills from human demonstration. In particular, the robot learned how to grasp, pour, and place beverage containers after just a single
demonstration. Peter focused on tasks like turning a door handle or grasping a cup -- tasks that personal robots like the PR2 will perform over and over again. Instead of requiring new trajectory planning each
time a common task is encountered, the presented approach enables the robot to build up a library of movements that can be used to execute
these common goals. For this library to be useful, learned movements must be generalizable to new goal poses. In real life, the robot will never face the exact same situation twice. Therefore, the learned movements must be encoded in such a way that they can be adapted to different start and goal positions.

Peter used Dynamic Movement Primitives (DMPs), which allow the robot to encode movement
plans. The parameters of these DMPs can be learned efficiently from a
single demonstration, allowing a user to teach the PR2 new movements
within seconds. Thus, the presented imitation learning setup allows a user to teach discrete movements, such as grasping, placing, and releasing, and then apply these movements to manipulate several objects on a table. This obviates the need to plan a new trajectory
every single time a motion is reused. Furthermore, the DMPs allow the
robot to complete its task even when the goal is changed on-the-fly.
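The post doesn't give the details of Peter's implementation, but the behavior it describes — fitting a movement from one demonstration, then replaying it toward a different goal — can be illustrated with a minimal one-dimensional discrete DMP sketch in Python. The function names, gain values, and basis-function layout below are illustrative choices, not the actual PR2 code; the demonstration here is a synthetic minimum-jerk trajectory standing in for a recorded human motion.

```python
import numpy as np

def dmp_learn(y_demo, dt, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=3.0):
    """Fit DMP forcing-term weights from a single demonstrated trajectory.

    Transformation system (tau = 1):  y'' = alpha_z*(beta_z*(g - y) - y') + f(x)
    Canonical system:                 x'  = -alpha_x * x,  x(0) = 1
    """
    T = len(y_demo)
    yd = np.gradient(y_demo, dt)           # demonstrated velocity
    ydd = np.gradient(yd, dt)              # demonstrated acceleration
    y0, g = y_demo[0], y_demo[-1]
    t = np.arange(T) * dt
    x = np.exp(-alpha_x * t / t[-1])       # phase decays from 1 toward 0
    # Forcing term the demo implies, by rearranging the transformation system
    f_target = ydd - alpha_z * (beta_z * (g - y_demo) - yd)
    # Gaussian basis functions spaced along the phase variable
    c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))   # centers
    h = n_basis**1.5 / c                                # widths
    psi = np.exp(-h[None, :] * (x[:, None] - c[None, :])**2)
    xi = x * (g - y0)                      # phase- and amplitude-scaled input
    # One weighted least-squares fit per basis function (locally weighted regression)
    w = (np.sum(psi * (xi * f_target)[:, None], axis=0)
         / (np.sum(psi * (xi**2)[:, None], axis=0) + 1e-10))
    return dict(w=w, c=c, h=h, alpha_z=alpha_z, beta_z=beta_z, alpha_x=alpha_x)

def dmp_rollout(dmp, y0, g, dt=0.001, duration=1.5):
    """Integrate the learned DMP toward an arbitrary new goal g."""
    x, y, yd = 1.0, y0, 0.0
    path = []
    for _ in range(int(duration / dt)):
        psi = np.exp(-dmp['h'] * (x - dmp['c'])**2)
        f = (psi @ dmp['w']) / (psi.sum() + 1e-10) * x * (g - y0)
        ydd = dmp['alpha_z'] * (dmp['beta_z'] * (g - y) - yd) + f
        yd += ydd * dt
        y += yd * dt
        x += -dmp['alpha_x'] * x * dt      # Euler step of the canonical system
        path.append(y)
    return np.array(path)

# "Demonstration": a minimum-jerk reach from 0 to 1, learned in one shot
t = np.linspace(0, 1, 1001)
y_demo = 10 * t**3 - 15 * t**4 + 6 * t**5
dmp = dmp_learn(y_demo, dt=t[1] - t[0])
# Reuse the same learned movement for a doubled goal
new_path = dmp_rollout(dmp, y0=0.0, g=2.0)
```

Because the goal `g` enters the attractor dynamics directly, the rolled-out trajectory converges to whatever goal is requested — even one changed after learning — while keeping the demonstrated movement shape, which is exactly the generalization property described above.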