
Active Extrinsic Calibration of Encoder and LIDAR for Robot
Author(s) - Yi Tong, Bin Lan, Xueqian Wang, Bin Liang
Publication year - 2022
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2188/1/012003
Subject(s) - calibration , encoder , computer science , lidar , robot , trajectory , process (computing) , computer vision , robot calibration , artificial intelligence , transformation (genetics) , sensor fusion , coordinate system , domain (mathematical analysis) , mobile robot , robot kinematics , remote sensing , mathematics , geography , physics , biochemistry , mathematical analysis , statistics , chemistry , astronomy , gene , operating system
Robotic extrinsic calibration is a prerequisite for multi-sensor data fusion, as it provides the transformation between the coordinate systems of different sensors. For wheeled robots equipped with an encoder and other sensors, the usual existing calibration procedure is either carried out by professional calibration engineers or performed with arbitrary motions, which is inefficient and may cause motion degradation or introduce unknown uncertainty. In this paper, we propose an active extrinsic calibration algorithm for an encoder and a 2D LIDAR. The trajectory followed during calibration is planned in belief space and optimized to obtain the calibration value with minimized uncertainty. Our method does not rely on the maximum-likelihood observation assumption and can also work in a discontinuous sensing domain. Simulations are conducted to validate the method.
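To make the calibration problem concrete, the sketch below (in Python, not the authors' belief-space algorithm) illustrates a generic planar hand-eye formulation for the encoder-to-LIDAR extrinsic: each motion segment yields an encoder (odometry) increment and a LIDAR increment (e.g., from scan matching), and the unknown extrinsic X satisfies A_i X = X B_i. The conditioning of the resulting least-squares system depends on the trajectory driven, which is the quantity an active, uncertainty-minimizing planner would seek to improve; all function and variable names here are illustrative assumptions.

import numpy as np

def rot2(theta):
    # 2x2 planar rotation matrix
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def calibrate_encoder_lidar(segments):
    """segments: list of ((theta_Ai, t_Ai), (theta_Bi, t_Bi)) motion pairs,
    where A_i is the encoder-frame motion and B_i the LIDAR-frame motion."""
    rows, rhs = [], []
    for (th_a, t_a), (th_b, t_b) in segments:
        # Translation part of A_i X = X B_i:
        #   (R_Ai - I) t_X - R_X t_Bi = -t_Ai,
        # with R_X t_Bi rewritten as M(t_Bi) [cos(theta_X), sin(theta_X)]^T,
        # so the unknowns are [t_X, cos(theta_X), sin(theta_X)].
        M = np.array([[t_b[0], -t_b[1]],
                      [t_b[1],  t_b[0]]])
        rows.append(np.hstack([rot2(th_a) - np.eye(2), -M]))
        rhs.append(-np.asarray(t_a, dtype=float))
    J = np.vstack(rows)                    # stacked (2N x 4) system
    y = np.concatenate(rhs)
    x, *_ = np.linalg.lstsq(J, y, rcond=None)
    t_X = x[:2]
    theta_X = np.arctan2(x[3], x[2])       # re-normalize [cos, sin]
    # Unscaled covariance proxy: large entries flag a poorly exciting
    # trajectory (e.g., straight-line motion only), which an active
    # calibration scheme would avoid by planning more informative motions.
    cov = np.linalg.pinv(J.T @ J)
    return t_X, theta_X, cov

A trajectory mixing rotations and translations makes J well conditioned and the covariance small, whereas a degenerate trajectory leaves the extrinsic only weakly observable; this is the intuition behind planning the calibration trajectory actively rather than relying on arbitrary motions.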