Field trial results of planetary rover visual motion estimation in Mars analogue terrain
Author(s) -
Bakambu Joseph Nsasi,
Langley Chris,
Pushpanathan Giri,
MacLean W. James,
Mukherji Raja,
Dupuis Erick
Publication year - 2012
Publication title -
Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.21409
Subject(s) - mars exploration program , traverse , terrain , visual odometry , inertial measurement unit , computer science , artificial intelligence , odometry , computer vision , exploration of mars , simulation , remote sensing , robot , mobile robot , geodesy , geography , physics , cartography , astronomy
This paper presents the Mojave Desert field test results of planetary rover visual motion estimation (VME) developed under the “Autonomous, Intelligent, and Robust Guidance, Navigation, and Control for Planetary Rovers (AIR‐GNC)” project. Three VME schemes are compared under realistic conditions. The main innovations of this project include the use of different features from stereo‐pair images as visual landmarks and the use of vision‐based feedback to close the path‐tracking loop. The multiweek field campaign, conducted on relevant Mars analogue terrains under dramatically changing lighting and weather conditions, shows good localization accuracy on average. Moreover, the MDA‐developed inertial measurement unit (IMU)‐corrected odometry was reliable and accurate at all test locations, including loose sand dunes. These results are based on data collected during 7.3 km of traverse, including both fully autonomous and joystick‐driven runs. © 2012 Wiley Periodicals, Inc.
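The abstract notes that the VME schemes use features from stereo-pair images as visual landmarks. The paper's own algorithms are not reproduced in this record, but in stereo visual odometry generally, the frame-to-frame pose update is typically a least-squares rigid alignment of matched 3-D landmark positions. A minimal sketch of that alignment step, using the SVD-based Kabsch/Horn method (all names are illustrative, not taken from the paper):

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) such that Q ~= R @ p + t
    for matched rows of P and Q, via the SVD-based Kabsch/Horn method.
    P, Q: (N, 3) arrays of 3-D landmark positions triangulated from
    stereo pairs in two consecutive frames."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic check: recover a known 30-degree yaw and a small translation
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 0.1])
rng = np.random.default_rng(0)
P = rng.standard_normal((20, 3))
Q = P @ R_true.T + t_true
R_est, t_est = estimate_rigid_transform(P, Q)
```

In a full pipeline this step would be wrapped in an outlier-rejection loop (e.g. RANSAC over the feature matches) before the pose is accumulated along the traverse; this sketch shows only the noise-free alignment core.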