
Augmenting ViSP’s 3D Model-Based Tracker with RGB-D SLAM for 3D Pose Estimation in Indoor Environments
Author(s) -
J. Li-Chee-Ming,
C. Armenakis
Publication year - 2016
Publication title -
The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.264
H-Index - 71
eISSN - 1682-1777
pISSN - 1682-1750
DOI - 10.5194/isprsarchives-xli-b1-925-2016
Subject(s) - computer vision , artificial intelligence , simultaneous localization and mapping , pose , computer science , tracking , rgb color model , feature , trajectory , mobile robot , robot
This paper presents a novel application of the Visual Servoing Platform (ViSP) for pose estimation in indoor and GPS-denied outdoor environments. Our proposed solution integrates the trajectory solution from RGB-D SLAM into ViSP’s pose estimation process. Li-Chee-Ming and Armenakis (2015) explored the application of ViSP to mapping large outdoor environments and tracking large objects (i.e., building models). Their experiments revealed that tracking was often lost due to a lack of model features in the camera’s field of view and to rapid camera motion. Further, the pose estimate was often biased by incorrect feature matches. This work proposes to improve ViSP’s pose estimation performance by integrating it with RGB-D SLAM, aiming specifically to reduce the frequency of tracking losses and the biases in the pose estimate. We discuss the performance of the combined tracker in mapping indoor environments and tracking 3D wireframe indoor building models, and present preliminary results from our experiments.
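The sketch below is not the authors' implementation; it is a minimal illustration, assuming ViSP 3.x, of how an externally estimated camera pose (e.g., from an RGB-D SLAM trajectory) could seed and re-seed ViSP's model-based edge tracker when model-based tracking is lost. The model file name, camera intrinsics, and the helpers getSlamPose() and grabFrame() are hypothetical placeholders.

    // Sketch only: feeding an external SLAM pose into ViSP's model-based tracker.
    #include <visp3/core/vpImage.h>
    #include <visp3/core/vpHomogeneousMatrix.h>
    #include <visp3/core/vpCameraParameters.h>
    #include <visp3/core/vpException.h>
    #include <visp3/mbt/vpMbEdgeTracker.h>

    // Hypothetical helper: latest camera pose (cMo) from the RGB-D SLAM trajectory.
    vpHomogeneousMatrix getSlamPose();
    // Hypothetical helper: grabs the next greyscale frame from the RGB-D sensor.
    bool grabFrame(vpImage<unsigned char> &I);

    int main()
    {
      vpImage<unsigned char> I;
      vpCameraParameters cam(600, 600, 320, 240);   // placeholder intrinsics

      vpMbEdgeTracker tracker;
      tracker.setCameraParameters(cam);
      tracker.loadModel("building.wrl");            // 3D wireframe building model (placeholder)

      bool initialized = false;
      while (grabFrame(I)) {
        if (!initialized) {
          // Seed the model-based tracker with the SLAM pose instead of a
          // manual point-based initialization.
          tracker.initFromPose(I, getSlamPose());
          initialized = true;
        }
        try {
          tracker.track(I);
          vpHomogeneousMatrix cMo;
          tracker.getPose(cMo);                     // refined model-based pose
        }
        catch (const vpException &) {
          // Tracking lost (e.g., too few model edges in view or rapid motion):
          // fall back to the SLAM trajectory and re-initialize on the next frame.
          initialized = false;
        }
      }
      return 0;
    }

In this arrangement the SLAM trajectory acts as a pose prior that bridges gaps where the wireframe model offers too few visible features, while the model-based tracker refines the pose whenever enough model edges are in view.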