Consistent decentralized cooperative localization for autonomous vehicles using LiDAR, GNSS, and HD maps
Author(s) -
Elwan Héry,
Philippe Xu,
Philippe Bonnifait
Publication year - 2021
Publication title -
Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.22004
Subject(s) - GNSS applications, computer science, LiDAR, consistency (knowledge bases), estimator, ranging, set (abstract data type), metric (unit), kinematics, sensor fusion, global positioning system, bounded function, position (finance), artificial intelligence, satellite system, computer vision, real-time computing, mathematics, remote sensing, geography, engineering, telecommunications, statistics, operations management, physics, finance, classical mechanics, economics, programming language, mathematical analysis
To navigate autonomously, a vehicle must be able to localize itself with respect to its driving environment and the vehicles with which it interacts. This study presents a decentralized cooperative localization method. It is based on the exchange of local dynamic maps (LDM), which are cyber‐physical representations of the physical driving environment containing poses and kinematic information about nearby vehicles. An LDM acts as an abstraction layer that makes the cooperation framework sensor‐agnostic, and it can even improve the localization of a sensorless communicating vehicle. With this goal in mind, this study focuses on the property of consistency in LDM estimates. Uncertainty in the estimates needs to be properly modeled, so that the estimation error can be statistically bounded for a given confidence level. To obtain a consistent system, we first introduce a decentralized fusion framework that can cope with LDMs whose errors have an unknown degree of correlation. Second, we present a consistent method for estimating the relative pose between vehicles, using a two‐dimensional LiDAR (light detection and ranging) with a point‐to‐line metric within an iterative‐closest‐point approach, combined with communicated polygonal shape models. Finally, we add a bias estimator to reduce position errors when nondifferential GNSS (global navigation satellite system) receivers are used, based on visual observations of features geo‐referenced in a high‐definition map. Real experiments were conducted, and the consistency of our approach was demonstrated on a platooning scenario using two experimental vehicles. The full experimental data set used in this study is publicly available.
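The abstract's first contribution is a decentralized fusion framework that stays consistent even when the errors of the exchanged LDM estimates have an unknown degree of correlation. Covariance intersection is the standard tool for that setting: it fuses two estimates through a convex combination of their information matrices and guarantees the fused covariance is not overconfident for any cross-correlation. The sketch below is an illustration of that generic technique, not the authors' actual filter; the variable names and the trace-minimizing choice of the weight are assumptions for the example.

```python
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, omega=None):
    """Fuse estimates (xa, Pa) and (xb, Pb) whose cross-correlation is unknown.

    omega in [0, 1] weights the two information matrices; if omega is None,
    it is chosen by a coarse grid search minimizing the fused covariance trace.
    Returns the fused state and a covariance that is consistent for any
    (unknown) correlation between the two inputs.
    """
    Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)
    if omega is None:
        grid = np.linspace(0.0, 1.0, 101)
        omega = min(grid, key=lambda w: np.trace(
            np.linalg.inv(w * Pa_inv + (1.0 - w) * Pb_inv)))
    P = np.linalg.inv(omega * Pa_inv + (1.0 - omega) * Pb_inv)
    x = P @ (omega * Pa_inv @ xa + (1.0 - omega) * Pb_inv @ xb)
    return x, P

# Example: two 2-D position estimates, each precise along a different axis.
xa, Pa = np.zeros(2), np.diag([1.0, 4.0])
xb, Pb = np.array([0.1, 0.0]), np.diag([4.0, 1.0])
x_fused, P_fused = covariance_intersection(xa, Pa, xb, Pb)
```

Unlike a naive Kalman-style update, the fused covariance never shrinks below what any actual correlation could justify, which is what allows the estimation error to remain statistically bounded at the chosen confidence level.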