Autonomous Visual Mapping and Exploration With a Micro Aerial Vehicle
Author(s) - Lionel Heng, Dominik Honegger, Gim Hee Lee, Lorenz Meier, Petri Tanskanen, Friedrich Fraundorfer, Marc Pollefeys
Publication year - 2014
Publication title - Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.21520
Subject(s) - computer vision, simultaneous localization and mapping, artificial intelligence, inertial measurement unit, computer science, payload (computing), odometry, metric (unit), robot, mobile robot, engineering, computer network, operations management, network packet
Cameras are a natural fit for micro aerial vehicles (MAVs) due to their low weight, low power consumption, and two-dimensional field of view. However, computationally intensive algorithms are required to infer the 3D structure of the environment from 2D image data. This requirement is made more challenging by the MAV's limited payload, which allows for only one CPU board. Hence, we have to design efficient algorithms for state estimation, mapping, planning, and exploration. We implement a set of algorithms on two different vision-based MAV systems such that these algorithms enable the MAVs to map and explore unknown environments. By using both self-built and off-the-shelf systems, we show that our algorithms can be used on different platforms. All algorithms necessary for autonomous mapping and exploration run on board the MAV. Using a front-looking stereo camera as the main sensor, we maintain a tiled octree-based 3D occupancy map. The MAV uses this map for local navigation and frontier-based exploration. In addition, we use a wall-following algorithm as an alternative exploration algorithm in open areas where frontier-based exploration underperforms. During exploration, data is transmitted to the ground station, which runs large-scale visual SLAM. We estimate the MAV's state by fusing inertial data from an IMU with metric velocity measurements from a custom-built optical flow sensor and pose estimates from visual odometry. We verify our approaches with experimental results which, to the best of our knowledge, demonstrate that our MAVs are the first vision-based MAVs to autonomously explore both indoor and outdoor environments.
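The abstract mentions frontier-based exploration over an occupancy map but does not include code. As a rough illustration of the frontier idea only, the sketch below finds frontier cells (free cells bordering unknown space) on a small 2D grid; the paper itself uses a tiled octree-based 3D map, and the grid layout, cell labels, and function name here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Cell states in a minimal 2D occupancy grid (assumed encoding).
FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def find_frontier_cells(grid: np.ndarray) -> list[tuple[int, int]]:
    """Return free cells that border at least one unknown cell.

    These 'frontier' cells are the candidate goals that
    frontier-based exploration steers the vehicle toward.
    """
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            # 4-connected neighborhood check for adjacent unknown space.
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

# Tiny usage example: a mostly unknown 4x4 map with one explored row.
grid = np.full((4, 4), UNKNOWN)
grid[0, :] = FREE        # a free corridor along the top edge
grid[1, 0] = OCCUPIED    # one observed obstacle
print(find_frontier_cells(grid))  # free cells adjacent to unknown cells
```

In practice, such an exploration loop would cluster the frontier cells, pick a reachable cluster (e.g., the nearest one), and replan as the map grows; the wall-following fallback described in the abstract covers open areas where few useful frontiers exist.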