
Visual SLAM Framework Based on Segmentation with the Improvement of Loop Closure Detection in Dynamic Environments
Author(s) -
Liping Sun,
Rohan P. Singh,
Fumio Kanehiro
Publication year - 2021
Publication title -
Journal of Robotics and Mechatronics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.257
H-Index - 19
eISSN - 1883-8049
pISSN - 0915-3942
DOI - 10.20965/jrm.2021.p1385
Subject(s) - artificial intelligence, simultaneous localization and mapping, computer vision, computer science, robustness, segmentation, odometry, Kalman filter, extended Kalman filter, robotics, point cloud, object detection, robot, mobile robot
Most simultaneous localization and mapping (SLAM) systems assume that SLAM is conducted in a static environment. When SLAM is used in dynamic environments, the accuracy of each part of the SLAM system is adversely affected; we term this problem dynamic SLAM. In this study, we propose solutions for three main problems in dynamic SLAM: camera tracking, three-dimensional map reconstruction, and loop closure detection. For object segmentation, we employ a geometry-based method, a deep-learning-based method, and a combination of the two. Using the segmentation results to generate a mask, we filter out both the keypoints that cause errors in visual odometry and the CNN-extracted features from dynamic areas, improving the performance of loop closure detection. We then validate the proposed loop closure detection method using precision-recall curves and confirm the framework's performance on multiple datasets. Absolute trajectory error and relative pose error are used as metrics to compare the accuracy of the proposed SLAM framework with that of state-of-the-art methods. The findings of this study can improve the robustness of SLAM in situations where mobile robots work alongside humans, and the object-based point cloud produced as a byproduct has potential uses in other robotics tasks.
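The mask-based filtering described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes keypoints are given as (x, y) pixel coordinates and that the segmentation stage produces a boolean mask marking dynamic regions; the function name `filter_dynamic_keypoints` is hypothetical.

```python
import numpy as np

def filter_dynamic_keypoints(keypoints, dynamic_mask):
    """Keep only keypoints that fall outside segmented dynamic regions.

    keypoints    -- (N, 2) array of (x, y) pixel coordinates
    dynamic_mask -- (H, W) boolean array, True where a dynamic object
                    (e.g. a person) was segmented
    """
    xs = keypoints[:, 0].astype(int)
    ys = keypoints[:, 1].astype(int)
    keep = ~dynamic_mask[ys, xs]          # image is indexed (row, col) = (y, x)
    return keypoints[keep]

# Toy example: a 4x4 image whose left half is covered by a moving object.
mask = np.zeros((4, 4), dtype=bool)
mask[:, :2] = True                        # columns 0-1 are dynamic
kps = np.array([[0, 0], [1, 2], [3, 3], [2, 1]])
static_kps = filter_dynamic_keypoints(kps, mask)
print(static_kps)                         # only keypoints with x >= 2 survive
```

In a real pipeline the same mask would also gate which image regions contribute CNN features to the loop-closure descriptor, so that appearance changes caused by moving objects do not corrupt place recognition.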
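The absolute trajectory error (ATE) mentioned as an evaluation metric is commonly reported as the RMSE of translational differences between the estimated and ground-truth trajectories. A minimal sketch, assuming both trajectories are time-synchronized and already expressed in the same frame (a full evaluation would first rigidly align them, e.g. with the Umeyama method):

```python
import numpy as np

def ate_rmse(gt, est):
    """Root-mean-square absolute trajectory error over translation only.

    gt, est -- (N, 3) arrays of ground-truth / estimated camera positions,
               assumed synchronized and expressed in a common frame.
    """
    diff = gt - est
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

gt = np.array([[0., 0., 0.], [1., 0., 0.], [2., 0., 0.]])
est = gt + np.array([0., 3., 4.])         # constant offset of length 5
print(ate_rmse(gt, est))                  # 5.0
```

Relative pose error (RPE), the other metric cited, instead compares relative motions over a fixed time or distance interval, making it sensitive to drift rather than to a global offset.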