Our group has developed some of the first fully autonomous micro-aerial vehicle (MAV) systems capable of self-directed exploration in GPS-denied environments. The vehicles leverage simultaneous localization and mapping (SLAM) algorithms to localize themselves within environmental maps that are estimated concurrently.

Visual Navigation For GPS-Denied Flight

Cameras are powerful sensors for robotic navigation because they are lightweight, low-power, and provide dense environmental information at high frame rates. Our recent work has focused on using monocular (i.e., single-lens) cameras as the primary navigation sensor for MAV flight, leveraging state-of-the-art computer vision algorithms capable of real-time operation on computationally constrained platforms. Our work has spanned both sparse keypoint methods and dense and semi-dense methods for real-time simultaneous tracking and mapping. Our group's previous work involved visual odometry using a stereo camera or Microsoft Kinect sensor, fused with an inertial measurement unit (IMU), with all processing done onboard the vehicle.
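To make the sparse keypoint approach concrete, the sketch below shows one core building block of monocular visual odometry: recovering the essential matrix relating two camera views from matched, calibrated keypoints via the classic linear eight-point algorithm. This is a generic textbook illustration, not our actual onboard pipeline; the function name and the synthetic data are our own for this example, and a real system would add outlier rejection (e.g., RANSAC) and decompose E into rotation and translation.

```python
import numpy as np

def essential_from_matches(x1, x2):
    """Estimate the essential matrix E from >= 8 correspondences in
    normalized (calibrated) image coordinates, using the linear
    eight-point algorithm. x1, x2: (N, 2) arrays."""
    N = x1.shape[0]
    # Each match contributes one row of A vec(E) = 0, derived from the
    # epipolar constraint x2^T E x1 = 0.
    A = np.zeros((N, 9))
    for i, ((u1, v1), (u2, v2)) in enumerate(zip(x1, x2)):
        A[i] = [u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1, 1.0]
    # The least-squares solution is the right singular vector of A
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Project onto the essential-matrix manifold: singular values (s, s, 0).
    U, _, Vt = np.linalg.svd(E)
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt
```

Given E, the relative camera rotation and (up-to-scale) translation can be extracted by a further SVD decomposition; fusing those increments with IMU measurements resolves the scale ambiguity inherent to a single camera.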

State-Estimation For Aggressive Flight

We have developed a system that enables a fixed-wing vehicle to localize itself within a known map and fly autonomously using only a 2D laser scanner fused with an IMU. All processing is done onboard the vehicle.
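Localizing in a known map with range measurements is commonly done with a recursive Bayesian filter. The toy example below illustrates the idea with a particle filter in one dimension: particles are propagated with noisy motion increments (as an IMU/odometry prediction would be) and reweighted by a laser range measurement against a known map. This is a deliberately minimal illustration of the principle, not the estimator used on the vehicle; the single-wall "map" and all parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

WALL_X = 10.0        # known map for this toy example: one wall at x = 10 m
N_PARTICLES = 500

def predict(particles, u, motion_std=0.05):
    """Propagate each particle with the commanded motion u plus noise,
    standing in for the IMU/odometry prediction step."""
    return particles + u + rng.normal(0.0, motion_std, particles.shape)

def update(particles, z, meas_std=0.1):
    """Reweight particles by the Gaussian likelihood of the measured
    range z to the wall, then resample."""
    expected = WALL_X - particles            # range each particle predicts
    w = np.exp(-0.5 * ((z - expected) / meas_std) ** 2)
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Simulate a robot starting at x = 2 m, moving 0.2 m per step.
true_x = 2.0
particles = rng.uniform(0.0, 9.0, N_PARTICLES)   # initially uncertain pose
for _ in range(20):
    true_x += 0.2
    particles = predict(particles, 0.2)
    z = (WALL_X - true_x) + rng.normal(0.0, 0.1)  # noisy laser range
    particles = update(particles, z)

estimate = particles.mean()
```

Even from global uncertainty, the filter converges to the true position within a few updates; the real 2D problem replaces the single range with a full laser scan ray-cast against the map.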

Laser Scan Matching For GPS-Denied Flight

We have developed a flight system whose position estimates come from laser scan matching, which aligns successive 2D laser scans to recover the vehicle's motion.
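The core geometric step inside a scan matcher is rigid alignment: finding the rotation and translation that best map one set of scan points onto another. Assuming known point correspondences (in practice, ICP-style matchers re-estimate correspondences each iteration), this has a closed-form SVD solution, sketched below. The function name and synthetic scans are illustrative assumptions, not our flight code.

```python
import numpy as np

def align_scans(P, Q):
    """Find the 2D rotation R and translation t minimizing
    sum_i || R @ P[i] + t - Q[i] ||^2 for corresponding point sets
    P, Q of shape (N, 2), via the Kabsch / SVD closed form."""
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - p_bar).T @ (Q - q_bar)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = q_bar - R @ p_bar
    return R, t
```

Chaining the relative transforms between consecutive scans yields a position estimate, which is then fused with the IMU for a smooth, high-rate state estimate.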