Vision-aided inertial navigation for pin-point landing using observations of mapped landmarks
Author(s) -
Trawny Nikolas,
Mourikis Anastasios I.,
Roumeliotis Stergios I.,
Johnson Andrew E.,
Montgomery James F.
Publication year - 2007
Publication title -
Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.20189
Subject(s) - inertial measurement unit , computer vision , computer science , artificial intelligence , inertial frame of reference , extended kalman filter , acceleration , kalman filter , spacecraft , ellipse , estimator , inertial navigation system , control theory (sociology) , engineering , aerospace engineering , mathematics , physics , statistics , geometry , classical mechanics , quantum mechanics , control (management)
In this paper we describe an extended Kalman filter algorithm for estimating the pose and velocity of a spacecraft during entry, descent, and landing. The proposed estimator combines measurements of rotational velocity and acceleration from an inertial measurement unit (IMU) with observations of a priori mapped landmarks, such as craters or other visual features, that exist on the surface of a planet. The tight coupling of inertial sensory information with visual cues results in accurate, robust state estimates available at a high bandwidth. The dimensions of the landing uncertainty ellipses achieved by the proposed algorithm are three orders of magnitude smaller than those possible when relying exclusively on IMU integration. Extensive experimental and simulation results are presented, which demonstrate the applicability of the algorithm on real-world data and analyze the dependence of its accuracy on several system design parameters. © 2007 Wiley Periodicals, Inc.
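The core idea in the abstract, a Kalman filter that propagates the state with IMU measurements and corrects it with observations tied to landmarks at known mapped positions, can be illustrated with a toy example. The sketch below is not the paper's 6-DoF vision-aided filter; it is a minimal 1-D analogue, assuming a state of position and velocity, accelerometer-driven propagation, and a range measurement to a single landmark at a known coordinate. All names and noise values are illustrative.

```python
# Toy 1-D analogue of IMU propagation + mapped-landmark updates.
# State x = [position, velocity]; covariance P is 2x2, kept as nested lists.

def mat22_mul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat22_T(A):
    """Transpose a 2x2 matrix."""
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

class LandmarkKF:
    def __init__(self, q=0.01, r=0.25):
        self.x = [0.0, 0.0]                      # [position, velocity]
        self.P = [[100.0, 0.0], [0.0, 10.0]]     # large initial uncertainty
        self.q = q                               # process-noise intensity
        self.r = r                               # range-measurement variance

    def propagate(self, accel, dt):
        """IMU step: integrate measured acceleration (grows uncertainty)."""
        p, v = self.x
        self.x = [p + v * dt + 0.5 * accel * dt * dt, v + accel * dt]
        F = [[1.0, dt], [0.0, 1.0]]
        Q = [[self.q * dt**3 / 3, self.q * dt**2 / 2],
             [self.q * dt**2 / 2, self.q * dt]]
        FPFt = mat22_mul(mat22_mul(F, self.P), mat22_T(F))
        self.P = [[FPFt[i][j] + Q[i][j] for j in range(2)] for i in range(2)]

    def update(self, range_meas, landmark_pos):
        """Landmark step: z = landmark_pos - position, so H = [-1, 0]."""
        innov = range_meas - (landmark_pos - self.x[0])
        S = self.P[0][0] + self.r                     # H P H^T + R (scalar)
        K = [-self.P[0][0] / S, -self.P[1][0] / S]    # gain K = P H^T S^-1
        self.x[0] += K[0] * innov
        self.x[1] += K[1] * innov
        # P = (I - K H) P; with H = [-1, 0], I - K H = [[1+K0, 0], [K1, 1]]
        self.P = mat22_mul([[1.0 + K[0], 0.0], [K[1], 1.0]], self.P)

# Usage: lander truly at rest at position 3.0, landmark mapped at 10.0.
kf = LandmarkKF()
kf.propagate(0.0, 0.1)       # IMU-only step: covariance grows
kf.update(7.0, 10.0)         # exact range observation pulls estimate to ~3.0
```

The update illustrates why the paper's tightly coupled scheme shrinks the uncertainty ellipse: a single absolute observation anchored to a mapped landmark collapses the position variance that pure IMU integration can only inflate.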
