Planetary Monocular Simultaneous Localization and Mapping
Author(s) -
Abhinav Bajpai,
Guy Burroughes,
Affan Shaukat,
Yang Gao
Publication year - 2016
Publication title -
Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.21608
Subject(s) - simultaneous localization and mapping , artificial intelligence , computer vision , computer science , robustness , visual odometry , monocular , kalman filter , modular design , extended kalman filter , feature detection , robot , mobile robot
Planetary monocular simultaneous localization and mapping (PM‐SLAM), a modular, monocular SLAM system for planetary exploration, is presented. The approach incorporates a biologically inspired visual saliency model (i.e., semantic feature detection) for visual perception in order to improve robustness in the challenging operating environment of planetary exploration. A novel method of generating hybrid‐salient features, using point‐based descriptors to track the products of the visual saliency models, is introduced. The tracked features are used for rover and map state estimation with a SLAM filter, resulting in a system suitable for long‐distance autonomous (micro)rover navigation under the hardware constraints inherent to planetary rovers. Monocular images are used as the input to the system, since a major motivation is to reduce system complexity and optimize for microrover platforms. This paper sets out the various components of the modular SLAM system and then assesses their comparative performance using simulated data from the Planetary and Asteroid Natural Scene Generation Utility (PANGU), as well as real‐world datasets from the West Wittering field trials (performed by the STAR Lab) and the SEEKER field trials in Chile (performed by the European Space Agency). The system as a whole was shown to perform reliably, with the best performance observed using a combination of Hou‐saliency and speeded‐up robust features (SURF) descriptors with an extended Kalman filter, which performed with higher accuracy than a state‐of‐the‐art, independently optimized visual odometry localization system on a challenging real‐world dataset.
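The best-performing configuration pairs Hou-saliency (the spectral-residual saliency model of Hou and Zhang) with SURF descriptors and an extended Kalman filter. As a minimal illustration of the saliency stage only, the sketch below implements spectral-residual saliency in NumPy. This is not the authors' implementation: the 3x3 log-amplitude averaging, the `log1p` stability tweak, and the final normalization are assumptions of this sketch.

```python
import numpy as np

def spectral_residual_saliency(gray):
    """Spectral-residual saliency (Hou & Zhang, 2007) for a 2-D grayscale image.

    The spectral residual is the log-amplitude spectrum minus its local
    average; reconstructing the image from that residual (with the original
    phase) suppresses statistically common structure and keeps "surprising"
    regions, which serve as saliency candidates.
    """
    f = np.fft.fft2(gray.astype(np.float64))
    log_amp = np.log1p(np.abs(f))   # log1p avoids log(0) at empty frequencies
    phase = np.angle(f)

    # 3x3 box filter of the log amplitude (local average), edge-padded
    h, w = gray.shape
    p = np.pad(log_amp, 1, mode="edge")
    avg = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

    residual = log_amp - avg
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / sal.max() if sal.max() > 0 else sal

# Toy usage: a bright blob on a flat background is the only structure,
# so the saliency peak should land in or near the blob.
img = np.zeros((64, 64))
img[28:36, 28:36] = 1.0
sal = spectral_residual_saliency(img)
peak = np.unravel_index(np.argmax(sal), sal.shape)
```

In PM-SLAM, regions extracted from such a saliency map are then anchored with point descriptors (SURF in the best configuration) so they can be re-observed and tracked across frames for the EKF state update.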
