KUKA Innovation Award 2018

Team CoAware wins Innovation Award 2018

The winners of this year's Innovation Award on the "Real-World Interaction Challenge" have been announced: it’s the members of Team CoAware from Italy. The experts from the Istituto Italiano di Tecnologia were able to convince the international jury with their demonstration in the areas of dynamic human modeling, image processing and interaction control for robots. The team can now look forward to a prize of 20,000 euros.

At the Innovation Award booth, the team demonstrated how motion and ergonomics can be monitored and optimized in real time using a dynamic model – based on an LBR iiwa and proprietary software. Using a gripper hand, the lightweight robot independently grasped objects such as a metal cylinder and handed them to a worker, who could then machine the workpiece ergonomically with a polishing machine (more on this in the video playlist).

Convinced the jury: Team CoAware from Istituto Italiano di Tecnologia in Italy

"The idea is to make robots collaborate with humans – not to avoid them. This requires a lot of data fusion and is the biggest challenge we have", one of the team members said before the decision was announced.

Innovation Award 2018: See how the finalists excited trade visitors at the Hannover Messe

Real-World Interaction Challenge: The full range of ideas

KUKA is known above all for its versatile and powerful industrial robots. But the sensitive lightweight robot LBR iiwa can also master many tasks that arise in everyday life. Scenarios that seem simple to humans – serving ice cream, combing hair, stacking wood or measuring out liquids – require complex algorithms and numerous sensors. At the 2018 Innovation Award, the lightweight robots of the finalist teams mastered all these challenges with flying colors:

Team Alberta

The Robot Vision research group from the University of Alberta is working on image-guided motion control of robot arms and hands. The team is implementing processes that allow the robot to learn from humans through observation, gestures and dialog, so that future robot systems will be able to work with humans even in unstructured environments. The goal is for the robot to use this acquired knowledge to grip various everyday objects, workpieces and components and to sort them independently, even when new, unknown objects are introduced.

"We think of the fair as a window to the future, a place where you can present and discover real innovations. For us, it is a great opportunity to build a bridge between research and industry."

Team Alberta - University of Alberta, Canada

Team CRoW

The team from the Institute for Computational Design and Construction at the University of Stuttgart combines expertise in algorithmic geometry and the development of robot-based material processing systems. In this project, it aims to provide small and medium-sized companies with access to robot-assisted methods of work. The concept comprises a collaborative robot workbench with an augmented reality interface. The project will demonstrate a woodworking scenario in which a robot assists a human.

"Existing robotic interfaces and workflows require extensive expert knowledge and experience. We believe that they should radically change. With the help of Augmented Reality, non-expert users should be able to engage with robots as well."

Team CRoW - University of Stuttgart, Germany

Team DynaMaP

The team – drawn from Draper, a non-profit R&D organization, the Robot Locomotion Group at MIT and the Agile Robotics Lab at Harvard – aims to show that robots can orient themselves and execute tasks in unstructured environments. To this end, the team uses neural networks to determine the positions and interactive dynamics of objects in the environment. The team's developments are demonstrated by means of a maintenance task that is typically carried out by a human.

"If a robot is working next to a person at a cluttered workbench or in a crowded kitchen, it needs to be able to effectively recognize objects and to manipulate them – without really knowing anything about them beforehand."

Team DynaMaP - MIT & Harvard University, USA

Team UPEnD

Four researchers from the University of Pennsylvania combine their expertise from various robotics disciplines, ranging from image processing and manipulation planning to system integration. The team is tackling the challenges of a robotic system that handles containers filled with liquids for exact dosing, for example in the pharmaceutical industry. The robot is controlled using sensors on the robot arm and two stereo cameras.

"We believe that our project really embodies the goal of KUKA: to have robot teammates that can work effectively and safely hand in hand with human colleagues."

Team UPEnD - University of Pennsylvania, USA

KUKA Innovation Award 2018 - The Challenge

This year’s competition focused on robots that can interact both inside and outside the industrial environment, with an emphasis on direct support for humans. The applications had to be conceived and presented as realistically as possible.

To enable a fair comparison of the concepts, KUKA provided each finalist team with a flexFellow – a mobile robotic unit on which an LBR iiwa (a sensitive lightweight robot for safe human-robot collaboration) is mounted. In addition, a 3D vision system from the start-up company Roboception was made available.