Projects

Integrating complementary sensing modalities has become a popular approach to highly robust and accurate robot state estimation and mapping. In particular, Inertial Measurement Units (IMUs) have become smaller and cheaper and are present in most of today's mobile devices. Various combinations of an IMU with GPS, odometry, magnetic compasses, and visual correspondences have been proposed and will continue to be researched.
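As a toy illustration of why such fusion works (this is not any specific estimator from our work), a classic complementary filter blends a gyroscope, which is accurate over short horizons but drifts, with an accelerometer, which is noisy but drift-free. The function name, the blending weight `alpha`, and the single-axis pitch setup below are all illustrative assumptions:

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings into a pitch-angle estimate.

    gyro_rates: pitch rate in rad/s at each time step (illustrative input)
    accel_samples: (ax, az) accelerometer readings in m/s^2
    alpha: weight on the integrated gyro vs. the accelerometer tilt
    """
    pitch = 0.0
    estimates = []
    for omega, (ax, az) in zip(gyro_rates, accel_samples):
        # Gyroscope: integrate angular rate (good short-term, drifts long-term).
        gyro_pitch = pitch + omega * dt
        # Accelerometer: tilt from the gravity direction (noisy, but drift-free).
        accel_pitch = math.atan2(ax, az)
        # Blend: effectively high-pass the gyro, low-pass the accelerometer.
        pitch = alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
        estimates.append(pitch)
    return estimates
```

Running this on a stationary sensor with a biased gyro shows the point of fusion: pure integration of the gyro drifts without bound, while the blended estimate stays bounded near the accelerometer's tilt reading.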

Building scalable, high-quality maps, along with the ability to localise within them, is a primary need for truly autonomous robots if we want them to achieve complex tasks. Recently, the use of cameras (including colour and depth cameras, such as the Microsoft Kinect) to this end has become increasingly popular, due to the richness of information about the environment present in the images. Simultaneous Localisation And Mapping (SLAM) from imagery remains an ongoing challenge, in particular when it comes to scalability in space and time, higher spatial map density, and higher accuracy, while maintaining real-time operation.
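The core idea behind SLAM back-ends can be sketched in one dimension (a deliberately minimal sketch, not a description of any particular system): poses are estimated jointly from noisy relative odometry constraints and loop-closure constraints by least squares. The function name and the anchoring convention below are illustrative assumptions:

```python
import numpy as np

def solve_pose_graph_1d(odometry, loop_closures, anchor=0.0):
    """Least-squares 1-D pose graph for poses x_0..x_n.

    odometry: measured displacements u_i, encoding x_{i+1} - x_i = u_i
    loop_closures: tuples (i, j, z), encoding x_j - x_i = z
    The first pose is anchored at `anchor` to fix the gauge freedom.
    """
    n = len(odometry) + 1
    rows, rhs = [], []
    # Anchor constraint: x_0 = anchor.
    e0 = np.zeros(n)
    e0[0] = 1.0
    rows.append(e0)
    rhs.append(anchor)
    # One row per odometry constraint.
    for i, u in enumerate(odometry):
        r = np.zeros(n)
        r[i], r[i + 1] = -1.0, 1.0
        rows.append(r)
        rhs.append(u)
    # One row per loop closure.
    for i, j, z in loop_closures:
        r = np.zeros(n)
        r[i], r[j] = -1.0, 1.0
        rows.append(r)
        rhs.append(z)
    x, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)
    return x
```

With biased odometry of 1.1 per step and a loop closure reporting that the last pose lies 3.0 from the first, dead reckoning ends at 3.3, whereas the optimised trajectory is pulled back toward the loop-closure measurement. Real visual SLAM systems solve the same kind of problem nonlinearly, over 6-DoF poses and landmarks.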

For meaningful interaction of a mobile robot with its environment, as well as with human operators, the availability of accurate poses and dense maps will not be enough: instead, we need semantic information and a segmentation into 3-dimensional objects and hierarchies of objects -- importantly, concepts that are shared with humans. Moreover, the motion of individual objects, or even non-rigid deformation, e.g. of a person performing a task, needs to be well understood by a robot in order to make predictions, operate safely, and relate to its environment and tasks.

Tomorrow's domestic (and other) robots will be expected to interact with their environments in a meaningful way. We thus tackle the challenge of combining robot state estimation (localisation and mapping) with motion control. Using the robot's mobility as well as the degrees of freedom of its manipulators, we can support active exploration and mapping as well as recognition tasks.

We demonstrate the real-world applicability of our state estimation, mapping, and full navigation algorithms by deploying them on mobile robotic platforms. Published algorithms have been run on board Unmanned Aerial Systems (UAS), both on multicopter platforms and on solar-powered aeroplanes. Robustness, accuracy, and real-time capability under computational constraints pose the main challenges here: for safety reasons we want to rely only on on-board computation, since communication links may break.