
Assessment of stereo camera calibration techniques for a portable mobile mapping system
Author(s) -
Brogan Michael,
McLoughlin Simon,
Deegan Catherine
Publication year - 2013
Publication title -
IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/iet-cvi.2012.0085
Subject(s) - computer vision , artificial intelligence , calibration , computer science , mobile mapping , camera resectioning , stereo camera , camera auto calibration , object (grammar) , computer graphics (images) , mathematics , point cloud , statistics
Mobile mapping systems that detect and geo‐reference road markings almost always consist of a stereo camera system integrated with a global positioning system/inertial navigation system. The data acquired by this navigational system allow features detected in the stereo images to be assigned global co‐ordinates. An essential step in this process is the calibration of the cameras, as it relates the pose of the two cameras to each other and to a world co‐ordinate system. In Europe, road markings must be evaluated at a 35 m range, so the cameras are required to have a wide field of view. Traditional calibration methods typically require a calibration object that fills most of the calibration image. Covering such a large field of view would require a calibration object of substantial size, which would be impractical for a portable system. This study reviews the theory of camera calibration and then details two camera calibration techniques (using portable 3D and 2D calibration objects). The accuracy of these methods is then evaluated in a ground‐truth experiment.
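The abstract's pipeline (calibrated stereo cameras assigning 3D co-ordinates to detected features) can be sketched with the standard pinhole model. The following is a minimal illustrative example, not the authors' implementation: the intrinsic matrix, baseline, and feature position are assumed placeholder values, and a simple linear (DLT) triangulation recovers a point at the 35 m range mentioned in the abstract.

```python
import numpy as np

def projection_matrix(K, R, t):
    # P = K [R | t] maps homogeneous world points to homogeneous pixels
    return K @ np.hstack([R, t.reshape(3, 1)])

def project(P, X):
    # Project a 3D world point X to pixel coordinates
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, P2, u1, u2):
    # Linear (DLT) triangulation of one point from two calibrated views
    A = np.vstack([
        u1[0] * P1[2] - P1[0],
        u1[1] * P1[2] - P1[1],
        u2[0] * P2[2] - P2[0],
        u2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Assumed intrinsics for a wide field-of-view rig (illustrative values only)
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
baseline = 0.5  # assumed stereo baseline in metres

# Left camera at the origin; right camera offset along the x-axis (t = -R C)
P_left = projection_matrix(K, R, np.zeros(3))
P_right = projection_matrix(K, R, np.array([-baseline, 0.0, 0.0]))

# A hypothetical road-marking feature at the 35 m evaluation range
X_world = np.array([1.0, -1.5, 35.0])
u_left = project(P_left, X_world)
u_right = project(P_right, X_world)

# With known calibration, the stereo pair recovers the 3D position
X_est = triangulate(P_left, P_right, u_left, u_right)
print(np.round(X_est, 4))
```

Once such camera-frame co-ordinates are recovered, the GPS/INS pose is what promotes them to global co-ordinates; the accuracy of every step rests on the calibration whose portable variants the paper evaluates.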