A Shared Autonomy Interface for Household Devices

As robots begin to enter our homes and workplaces, they will have to deal with the devices and appliances that are already there. Unfortunately, devices that are easy for humans to operate often cause problems for robots. In teleoperation settings, the lack of tactile feedback makes manipulating buttons and switches awkward, and the robot’s gripper often occludes the control, further hampering the operator. In the autonomous setting, perceiving small buttons and switches is difficult due to sensor limitations and poor lighting conditions. Depth information helps little, since many of the controls we want to manipulate are small and lie close to the noise threshold of the depth sensors typically mounted on a mobile robot, making the controls extremely difficult to segment from the rest of the device.

In this paper, we present a shared autonomy approach to the operation of physical device controls. A human operator gives high-level guidance, helps identify controls and their locations, and sequences the actions of the robot. Autonomous software on our robot performs the lower-level actions that require closed-loop control, and estimates the exact positions and parameters of controls. We describe the overall system, and then give the results of our initial evaluations, which suggest that the system is effective in operating the controls on a physical device.
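The division of labor described above — the operator supplies coarse hints about controls and sequences tasks, while the robot refines the estimates and executes closed-loop motions — can be sketched roughly as follows. All names, the detection model, and the toy servo loop are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch of a shared-autonomy loop: the human supplies a coarse
# hint per control; the robot refines it and runs a closed-loop primitive.
# (Hypothetical names throughout; not the system described in the paper.)
from dataclasses import dataclass


@dataclass
class ControlHint:
    """Operator-supplied guidance: control type and a rough click location."""
    kind: str            # e.g. "button", "toggle"
    approx_xy: tuple     # coarse pixel location indicated by the operator


def refine_location(hint, detections):
    """Autonomous refinement: choose the detected candidate nearest the
    operator's coarse click, standing in for the robot's local estimator."""
    return min(
        detections,
        key=lambda d: (d[0] - hint.approx_xy[0]) ** 2
                    + (d[1] - hint.approx_xy[1]) ** 2,
    )


def execute(target, max_steps=50):
    """Stand-in for a closed-loop primitive (e.g. press until contact).
    Here we merely simulate proportional servoing toward the target."""
    pos = [0.0, 0.0]
    for _ in range(max_steps):
        err = (target[0] - pos[0], target[1] - pos[1])
        if abs(err[0]) < 1e-3 and abs(err[1]) < 1e-3:
            return True  # reached the control; a real robot would now actuate it
        pos[0] += 0.5 * err[0]  # simple proportional step
        pos[1] += 0.5 * err[1]
    return False


# The operator sequences the tasks; the robot handles each one autonomously.
hints = [ControlHint("button", (100, 42))]
detections = [(98.0, 40.5), (230.0, 12.0)]  # noisy autonomous detections
for h in hints:
    target = refine_location(h, detections)
    execute(target)
```

The point of the sketch is the interface boundary: the human resolves the perceptually ambiguous decisions (which control, in what order), while the robot handles the precision-critical ones (exact position, closed-loop motion).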