Open Access
Radar style transfer for metric robot localisation on lidar maps
Author(s) -
Yin Huan,
Wang Yue,
Wu Jun,
Xiong Rong
Publication year - 2023
Publication title -
CAAI Transactions on Intelligence Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.613
H-Index - 15
ISSN - 2468-2322
DOI - 10.1049/cit2.12112
Subject(s) - lidar , computer science , computer vision , radar , artificial intelligence , remote sensing , robot , radar imaging , metric (unit) , radar engineering details , geography , engineering , telecommunications , operations management
Abstract Lidar and visual data are heavily degraded in adverse weather conditions owing to their sensing mechanisms, posing potential safety hazards for vehicle navigation. Radar sensing is therefore desirable for building a more robust navigation system. In this paper, a cross‐modality radar localisation on prior lidar maps is presented. The proposed workflow consists of two parts: first, bird's‐eye‐view radar images are transferred to fake lidar images by training a generative adversarial network offline; then, with online radar scans, a Monte Carlo localisation framework is built to track the robot pose on lidar maps. The whole online localisation system needs only a rotating radar sensor and a pre‐built global lidar map. In the experimental section, the authors conduct an ablation study on image settings and test the proposed system on the Oxford Radar RobotCar Dataset. The promising results show that the proposed localisation system tracks the robot pose successfully, demonstrating the feasibility of radar style transfer for metric robot localisation on lidar maps.
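The online half of the workflow described above is a standard Monte Carlo localisation (particle filter) loop, with the GAN-transferred "fake lidar" scan serving as the measurement matched against the prior lidar map. The following is a minimal one-dimensional sketch of that loop, not the authors' implementation: the map is reduced to a single wall, and `z` stands in for a range read from the fake lidar image; all names and the toy measurement model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

WALL_X = 20.0  # toy stand-in for the pre-built global lidar map: one wall


def expected_range(pose):
    """Range a forward-facing sensor at `pose` would measure to the wall."""
    x, _y = pose
    return WALL_X - x if x < WALL_X else 100.0  # no return past the wall


def measurement_likelihood(z, pose, sigma=1.0):
    """Gaussian likelihood of the (fake-lidar) range z given a particle pose."""
    return np.exp(-0.5 * ((z - expected_range(pose)) / sigma) ** 2)


def mcl_step(particles, weights, control, z, motion_noise=0.3):
    """One predict-update-resample cycle of Monte Carlo localisation."""
    # Predict: propagate particles through the (noisy) motion model.
    particles = particles + control + rng.normal(0, motion_noise, particles.shape)
    # Update: reweight particles by agreement between scan and map.
    weights = np.array([measurement_likelihood(z, p) for p in particles])
    weights /= weights.sum()
    # Resample: draw a new particle set in proportion to the weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))


# Robot truly at x = 10, so the transferred scan reports range z = 10.
particles = rng.uniform(0.0, 50.0, size=(500, 2))
weights = np.full(500, 1.0 / 500)
for _ in range(10):
    particles, weights = mcl_step(particles, weights, control=np.zeros(2), z=10.0)

estimate = particles.mean(axis=0)
print(f"estimated x: {estimate[0]:.1f}")  # converges near the true x = 10
```

In the actual system the scalar range comparison would be replaced by matching the full bird's-eye-view fake lidar image against the lidar map, and the control input would come from odometry, but the predict-update-resample structure is the same.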