Open Access
Multi‐scale features fusion from sparse LiDAR data and single image for depth completion
Author(s) -
Wang Benzhang,
Feng Yiliu,
Liu Hengzhu
Publication year - 2018
Publication title -
Electronics Letters
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.375
H-Index - 146
ISSN - 1350-911X
DOI - 10.1049/el.2018.6149
Subject(s) - artificial intelligence , computer science , computer vision , pattern recognition , lidar , depth map , sparse approximation , sparse matrix , robustness , remote sensing , image (mathematics) , scale (ratio)
Recently, deep learning-based methods for dense depth completion from sparse depth data have shown superior performance to traditional techniques. However, sparse depth data lose details of the scene, such as spatial and texture information. To overcome this problem, an additional single image is introduced, and a multi-scale feature fusion scheme is proposed to learn more correlations between the two different data sources. Furthermore, a sparse convolution operation is exploited to improve feature robustness to sparse depth data. Experiments demonstrate that the approach clearly improves depth-completion performance and outperforms all previously published methods. The authors believe their work also offers guidance for stereo-image depth estimation fused with sparse LiDAR depth data.
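The "sparse convolution" the abstract mentions is commonly realised as a sparsity-invariant convolution: the filter sums only over pixels where LiDAR depth is observed, renormalises by the number of valid entries under the kernel, and propagates an updated validity mask. The sketch below is an illustrative NumPy implementation of that general idea, not the authors' actual network code; the function name and details are assumptions.

```python
import numpy as np

def sparse_conv2d(depth, mask, kernel, eps=1e-8):
    """Sparsity-invariant 2D convolution (illustrative sketch).

    depth  : HxW array of depth values (undefined where mask == 0)
    mask   : HxW array, 1.0 where a LiDAR measurement exists, else 0.0
    kernel : kh x kw filter weights

    Returns (output, new_mask): each output pixel is the kernel-weighted
    mean over *valid* neighbours only; new_mask marks pixels whose
    receptive field contained at least one valid measurement.
    """
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    # Zero out unobserved depths, then pad so output keeps the input size.
    d = np.pad(depth * mask, ((ph, ph), (pw, pw)))
    m = np.pad(mask, ((ph, ph), (pw, pw)))
    H, W = depth.shape
    out = np.zeros((H, W), dtype=float)
    new_mask = np.zeros((H, W), dtype=float)
    for i in range(H):
        for j in range(W):
            dw = d[i:i + kh, j:j + kw]   # depth window
            mw = m[i:i + kh, j:j + kw]   # validity window
            norm = (kernel * mw).sum()   # total weight on valid pixels
            if norm > eps:
                # Normalise by valid weight so output scale is
                # independent of how sparse the window is.
                out[i, j] = (kernel * dw).sum() / norm
                new_mask[i, j] = 1.0
    return out, new_mask
```

A useful sanity check of the normalisation: on a constant depth field, any pixel whose 3x3 neighbourhood contains at least one measurement recovers that constant exactly, regardless of sparsity.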
