Long‐range rover localization by matching LIDAR scans to orbital elevation maps
Author(s) -
Carle Patrick J.F.,
Furgale Paul T.,
Barfoot Timothy D.
Publication year - 2010
Publication title -
Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.20336
Subject(s) - odometry , computer vision , artificial intelligence , visual odometry , elevation (ballistics) , lidar , orientation (vector space) , computer science , remote sensing , terrain , matching (statistics) , range (aeronautics) , simultaneous localization and mapping , mars exploration program , traverse , geology , geodesy , geography , mobile robot , robot , mathematics , engineering , statistics , physics , geometry , cartography , astronomy , aerospace engineering
Abstract - Current rover localization techniques such as visual odometry have proven to be very effective on short‐ to medium‐length traverses (e.g., up to a few kilometers). This paper deals with the problem of long‐range rover localization (e.g., 10 km and up) by developing an algorithm named MOGA (Multi‐frame Odometry‐compensated Global Alignment). This algorithm is designed to globally localize a rover by matching features detected from a three‐dimensional (3D) orbital elevation map to features from rover‐based, 3D LIDAR scans. The accuracy and efficiency of MOGA are enhanced with visual odometry and inclinometer/sun‐sensor orientation measurements. The methodology was tested with real data, including 37 LIDAR scans of terrain from a Mars–Moon analog site on Devon Island, Nunavut. When a scan contained a sufficient number of good topographic features, localization produced position errors of no more than 100 m; most were less than 50 m, and some were as low as a few meters. Results were compared against VIPER, a competing global localization algorithm given the same initial conditions as MOGA, and MOGA was shown to outperform it. On a 10‐km traverse, MOGA's localization estimates significantly outperformed visual odometry estimates. This paper shows how the developed algorithm can be used to accurately and autonomously localize a rover over long‐range traverses. © 2010 Wiley Periodicals, Inc.
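The core idea the abstract describes, aligning rover-frame features to map-frame features to recover a global pose, can be illustrated with a minimal sketch. This is not the MOGA algorithm from the paper; it is only the standard rigid-alignment step (Kabsch/SVD) that such matching pipelines typically rely on once feature correspondences are hypothesized. The function name and the 2D setting are assumptions for illustration.

```python
import numpy as np

def estimate_rigid_transform(rover_pts, map_pts):
    """Estimate the 2D rotation R and translation t that best map
    rover-frame feature points onto corresponding map-frame points,
    minimizing sum ||R @ p_rover + t - p_map||^2 (Kabsch algorithm).

    rover_pts, map_pts: (N, 2) arrays of matched feature coordinates.
    """
    # Center both point sets on their centroids.
    c_rover = rover_pts.mean(axis=0)
    c_map = map_pts.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (rover_pts - c_rover).T @ (map_pts - c_map)
    # SVD gives the optimal rotation; the det correction
    # guards against a reflection solution.
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    # Translation follows from the centroids.
    t = c_map - R @ c_rover
    return R, t
```

In a global-localization setting like the one the paper studies, a step of this kind would be wrapped in a search over candidate feature correspondences between the orbital elevation map and the LIDAR scan, with odometry and orientation measurements constraining the candidates.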