AILA's robotic hands are designed to perform tasks just like a human.
DFKI

There's a lot to do aboard the International Space Station (ISS). There are science experiments to be performed and cared for; at least two hours of exercise per astronaut per day to try to prevent loss of bone density and muscle; space walks to work outside the space station; and routine maintenance and care of the station itself.

If a robot could take care of the more menial tasks, the astronauts could spend more time on jobs that require a little more intuition; the robot could also assist astronauts with tasks that require two operators. That's what the BesMan project's AILA robot is hoped to achieve.

Under development at the German Research Centre for Artificial Intelligence in collaboration with the University of Bremen, AILA is designed to perform manipulation procedures involving the use of one or two arms. As such, the robot is equipped with two arms with articulated fingers, like a human hand (because the ISS' interfaces are designed for human manipulation). She has 32 degrees of freedom, including seven in each of the arms, four in the torso and two in the head.

She is also attached to a wheeled platform, which allows her to move. Each of the six wheels has two degrees of freedom. In orbital microgravity, obviously, this would not be an ideal mode of locomotion; for Earth-based simulations and situations, however, it is fine.
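The figures quoted above add up neatly, assuming the 32 degrees of freedom include the mobile platform (the grouping below is ours, not DFKI's official breakdown):

```python
# Degrees of freedom (DoF) reported for AILA, grouped by body part.
# Assumption: the quoted total of 32 includes the wheeled platform.
dof = {
    "left arm": 7,
    "right arm": 7,
    "torso": 4,
    "head": 2,
    "wheels": 6 * 2,  # six wheels, two DoF each
}

total = sum(dof.values())
print(total)  # 32
```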

In her head, AILA has two Prosilica GC780C cameras for stereo robotic vision; a short-range Hokuyo laser scanner in her chest and a Mesa SwissRanger 4000 3D time-of-flight camera in her belly; a mini-ITX board and graphics card for vision processing, located in her torso; two 3.5-inch embedded PCs, one in her head for motion control, the other controlling the wheeled platform; and Gigabit Ethernet, routed through two five-port switches, to connect her head cameras and computers together and to the outside world.

To train AILA, the first step is to develop concepts based on human behaviours. These are then taught to the robot using imitation and reinforcement learning techniques: the operator shows the robot what to do, and the robot copies the actions. Ideally, this approach would work on non-humanoid robots, too.
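The imitation side of this can be pictured with a deliberately tiny sketch: the operator's demonstrations are recorded as state-action pairs, and the robot reproduces the action demonstrated in the most similar state. (This is a hypothetical one-dimensional illustration of learning from demonstration, not the BesMan project's actual pipeline, which combines imitation with reinforcement learning.)

```python
# Demonstrations recorded from the operator as (state, action) pairs.
# The "state" here is a single number purely for illustration.
demonstrations = [
    (0.0, "reach"),
    (0.5, "grasp"),
    (1.0, "turn"),
]

def imitate(state):
    """Imitation policy: copy the action shown at the nearest demonstrated state."""
    _, action = min(demonstrations, key=lambda sa: abs(sa[0] - state))
    return action

print(imitate(0.4))  # grasp
```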

This work is still ongoing, but it has progressed far enough for AILA to complete her first demonstration: performing a task in a simulation of the ISS.

"On the one side, a software framework and an accompanying embedded domain specific language (eDSL) have been developed to describe and control robot manipulation behaviours and keep their descriptions (and the descriptions of the tasks) independent of a particular robot," the team wrote. "Thus, the same robot high-level behaviour can be reused on robots of different morphology and/or hardware."

That is, the software that allows the robot to complete the task is not specific to AILA. Ideally, any robot could run the same software to complete the same task, even a non-humanoid one.
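The idea of keeping the task description independent of the robot can be sketched as follows: the task is written once in abstract terms, and each robot supplies its own mapping from abstract actions to hardware commands. (The names below are illustrative stand-ins, not DFKI's actual eDSL.)

```python
# The task is described once, abstractly, independent of any robot.
TASK = ["activate_switches", "turn_wheel", "deactivate_switches"]

class HumanoidRobot:
    """Hypothetical two-armed robot: maps abstract actions to its own hardware."""
    def execute(self, action):
        return f"humanoid arms perform: {action}"

class WheeledManipulator:
    """Hypothetical single-gripper robot running the very same task description."""
    def execute(self, action):
        return f"single gripper performs: {action}"

def run_task(robot, task):
    # The task runner never mentions a specific morphology.
    return [robot.execute(step) for step in task]

print(run_task(HumanoidRobot(), TASK))
print(run_task(WheeledManipulator(), TASK))
```

The same high-level behaviour runs unchanged on either robot; only the low-level `execute` mapping differs, which is the reuse the team describes.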

But AILA's physical frame is also important. The team continued, "On the other side, a whole-body reactive control approach is used in order to automatically find an optimal usage [of] all the available degrees of freedom at runtime."
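The core mathematical idea behind such redundancy resolution can be shown in miniature: when a robot has more joints than the task needs, the minimum-effort joint motion that still achieves the desired hand motion is given by the pseudoinverse of the task Jacobian, spreading the work across all available degrees of freedom. (A toy example of the general technique, not DFKI's actual controller.)

```python
import numpy as np

# A 2-D hand-velocity task performed by 4 joints: a redundant system,
# so infinitely many joint motions achieve the same hand motion.
J = np.array([[1.0, 0.5, 0.0, 0.2],
              [0.0, 1.0, 0.8, 0.1]])   # task Jacobian (2 x 4)
x_dot = np.array([0.1, 0.0])           # desired hand velocity

# Moore-Penrose pseudoinverse picks the minimum-norm joint velocities,
# automatically distributing the motion over all available joints.
q_dot = np.linalg.pinv(J) @ x_dot

# Check: the chosen joint motion reproduces the desired hand motion.
print(np.allclose(J @ q_dot, x_dot))   # True
```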

In the video below, AILA autonomously completes her task. First, she must activate a series of switches. Then, she has to turn a wheel. Finally, she has to deactivate the same switches she activated in the first part of the task.