Performance evaluation of 1‐point‐RANSAC visual odometry
Author(s) - Scaramuzza, Davide
Publication year - 2011
Publication title - Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.20411
Subject(s) - RANSAC, visual odometry, artificial intelligence, computer vision, outlier, computer science, motion estimation, structure from motion, odometry, mathematics, algorithm, image (mathematics), robot, mobile robot
Monocular visual odometry is the process of computing the egomotion of a vehicle purely from images of a single camera. This process involves extracting salient points from consecutive image pairs, matching them, and computing the motion using standard algorithms. This paper analyzes one of the most important steps toward accurate motion computation, which is outlier removal. The random sample consensus (RANSAC) has been established as the standard method for model estimation in the presence of outliers. RANSAC is an iterative method, and the number of iterations necessary to find a correct solution is exponential in the minimum number of data points needed to estimate the model. It is therefore of utmost importance to find the minimal parameterization of the model to estimate. For unconstrained motion [six degrees of freedom (DoF)] of a calibrated camera, this would be five correspondences. In the case of planar motion, the motion model complexity is reduced (three DoF) and can be parameterized with two points. In this paper we show that when the camera is installed on a nonholonomic wheeled vehicle, the model complexity reduces to two DoF and therefore the motion can be parameterized with a single‐point correspondence. Using a single‐feature correspondence for motion estimation is the lowest model parameterization possible and results in the most efficient algorithm for removing outliers, which we call 1‐point RANSAC. To support our method, we run many experiments on both synthetic and real data and compare the performance with state‐of‐the‐art approaches and with different vehicles, both indoors and outdoors. © 2011 Wiley Periodicals, Inc.
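The abstract's efficiency argument follows from the standard RANSAC iteration bound N = log(1 − p) / log(1 − (1 − ε)^s), where s is the minimal sample size, ε the expected outlier ratio, and p the desired probability of drawing at least one outlier-free sample. The sketch below evaluates this bound for the three parameterizations discussed (5-point, 2-point, 1-point); the values p = 0.99 and ε = 0.5 are illustrative assumptions, not figures taken from the paper.

```python
import math

def ransac_iterations(s, p=0.99, epsilon=0.5):
    """Number of RANSAC iterations needed so that, with probability p,
    at least one sampled subset of size s contains no outliers,
    assuming an outlier ratio epsilon (standard RANSAC bound)."""
    w = (1.0 - epsilon) ** s  # probability that one random s-sample is all inliers
    return math.ceil(math.log(1.0 - p) / math.log(1.0 - w))

# Minimal sample size per motion model from the abstract:
# 5-point (unconstrained 6-DoF), 2-point (planar, 3-DoF),
# 1-point (nonholonomic wheeled vehicle, 2-DoF).
for s in (5, 2, 1):
    print(f"{s}-point RANSAC: {ransac_iterations(s)} iterations")
```

With these assumed parameters the bound drops from 146 iterations for 5-point to 17 for 2-point and 7 for 1-point, which is the efficiency gain the abstract claims for the single-correspondence parameterization.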