We present an interface that allows users to direct a mobile
manipulation robot in tabletop pick-and-place tasks using only their
head motions and a single button. The system uses an estimate of the
user's head pose and a 3D world model maintained by the robot to
determine where the user is pointing their head. We report results
from preliminary evaluations of the system, which suggest that it is
both intuitive and effective. We also describe an example
trash-sorting application in which the user directs a PR2 robot to
sort objects into "trash" and "recycle" piles.
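The core geometric step described above, inferring the pointed-at location from head pose and the world model, can be sketched as a ray-plane intersection. The following is a minimal illustration only; the function name, parameters, and the simplification of the world model to a single horizontal tabletop plane are assumptions for exposition, not the paper's actual implementation.

```python
# Hypothetical sketch: intersect the head's forward ray with a
# horizontal table plane (a stand-in for the robot's 3D world model)
# to estimate where the user is pointing their head.
import numpy as np

def head_pointing_target(head_pos, head_dir, table_height):
    """Return the point where the head's forward ray meets z = table_height.

    head_pos: (3,) head position in the world frame, in meters.
    head_dir: (3,) unit vector along the head's forward axis.
    Returns a (3,) point, or None if the ray is parallel to the
    table plane or points away from it.
    """
    head_pos = np.asarray(head_pos, dtype=float)
    head_dir = np.asarray(head_dir, dtype=float)
    dz = head_dir[2]
    if abs(dz) < 1e-9:       # ray parallel to the table plane
        return None
    t = (table_height - head_pos[2]) / dz
    if t <= 0:               # plane is behind the user
        return None
    return head_pos + t * head_dir

# Example: a user's head at 1.5 m, tilted 45 degrees down toward
# a 0.75 m tabletop.
target = head_pointing_target([0.0, 0.0, 1.5],
                              [0.7071, 0.0, -0.7071], 0.75)
```

In a real system the ray would be tested against the full 3D world model rather than one plane, and the head pose would come from a perception pipeline rather than being given directly.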