The Mobile Robot Helper, or MR Helper for short, was developed at Tohoku University and used as a research platform between 1997 and 2004. The research focused on developing controls for smooth, natural, and safe human-robot cooperation, which required collision models of the robot, its partner, and its surroundings. Every object in the environment would be modeled in a simulation, with information like size, shape, weight, and velocity, so that the robot could move around without accidentally ramming into anything. Objects that it would manipulate directly, such as a coffee mug, would also be modeled with relevant grasping information so that the robot wouldn’t spill the mug’s contents.
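As a rough illustration of the kind of record such an "Object Model" might hold, here is a minimal Python sketch. The field names, units, and the `keep_upright` flag are assumptions for illustration, not the original implementation:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectModel:
    """Hypothetical per-object record for the simulation (names are assumed)."""
    name: str
    size: tuple                 # bounding box (x, y, z) in meters
    shape: str                  # e.g. "box", "cylinder"
    weight: float               # kg
    velocity: tuple = (0.0, 0.0, 0.0)                 # m/s, updated each tick
    grasp_points: list = field(default_factory=list)  # candidate grasp poses
    keep_upright: bool = False  # e.g. a mug whose contents must not spill

# Example entry: a coffee mug that must stay upright while carried.
mug = ObjectModel(
    name="coffee mug",
    size=(0.08, 0.08, 0.10),
    shape="cylinder",
    weight=0.35,
    grasp_points=[("handle", (0.04, 0.0, 0.05))],
    keep_upright=True,
)
```

A planner could then check `keep_upright` and the bounding box before choosing a grasp or a carrying trajectory.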

In addition to this “Object Model”, the robot would model both itself and its human partner in real time using a “Partner Model”. This allowed it to avoid self-collisions, track the human’s actions, and estimate the manipulated object’s trajectory. Combining the two models let the robot determine its course of action during a cooperative task: it could, for example, share an object’s load so as to alleviate the burden on its human counterpart. Force-torque sensors in the robot’s wrists would detect when a human was pushing against it, and the robot would react smoothly while taking its own joint range limits into account.
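The "push against the wrist, react smoothly" behavior described above is commonly realized with admittance control, where a measured force is filtered through a virtual mass-damper to produce a velocity command. The sketch below is a generic illustration of that idea, not MR Helper's actual controller; the gains, limits, and function names are assumptions:

```python
def admittance_step(v, force, dt, mass=5.0, damping=20.0):
    """One integration step of the virtual mass-damper M*dv/dt + D*v = F.

    Returns the new commanded velocity: a steady push ramps the speed
    smoothly toward F/D instead of jerking the arm.
    """
    accel = (force - damping * v) / mass
    return v + accel * dt

def clamp_to_limits(v, pos, lo, hi, margin=0.05):
    """Zero the command near a joint/workspace limit instead of hitting it."""
    if v > 0 and pos > hi - margin:
        return 0.0
    if v < 0 and pos < lo + margin:
        return 0.0
    return v

# A constant 10 N push: velocity converges toward F/D = 0.5 m/s.
v = 0.0
for _ in range(1000):          # 10 s at a 100 Hz control rate
    v = admittance_step(v, force=10.0, dt=0.01)
```

The damping term is what makes the response feel compliant rather than springy, and the limit clamp is one simple way to honor the joint range constraints mentioned above.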

Examples of cooperative tasks included moving a small table together with a human. The robot could also pick up boxes and objects on trays and hand them to a person, though of course an object’s properties had to be programmed into the simulation before it could do any of this. The hardware included three cameras, among them a stereo camera rig in the head, a 2 DOF neck, 1 DOF hands, an LCD monitor in the chest, and ultrasonic sensors in its base (the Vuton). Professor Kosuge also developed cooperative systems involving groups of robots, and would go on to develop the Partner Ballroom Dance Robot, which could anticipate and react smoothly to a partner’s dance movements.