DAN‐Conv: Depth aware non‐local convolution for LiDAR depth completion
Author(s) - Yan Lin, Liu Kai, Gao Long
Publication year - 2021
Publication title - Electronics Letters
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.375
H-Index - 146
eISSN - 1350-911X
pISSN - 0013-5194
DOI - 10.1049/ell2.12254
Subject(s) - artificial intelligence, computer science, computer vision, convolution (computer science), depth map, lidar, pixel, pattern recognition, artificial neural network, algorithm, benchmark
Sparse LiDAR depth completion benefits many robotic applications. It typically generates a dense depth prediction from a sparse depth map and its corresponding aligned RGB image. This image-guided depth completion task poses two main challenges: sparse data processing and multi-modality data fusion. In this letter, they are addressed by two novel solutions. (1) To process sparse depth input efficiently, a Depth Aware Non-local Convolution (DAN-Conv) is proposed. It augments the spatial sampling locations of the convolution operation: DAN-Conv constructs a non-local neighbourhood by exploiting the intrinsic correlation among observed depth pixels. In particular, it can readily replace standard convolution without introducing additional network parameters. (2) A Symmetric Co-attention Module (SCM) is proposed to fuse and enhance features from the depth and image domains. It estimates the importance of complementary features through a co-attention mechanism. Finally, a neural network built on DAN-Conv and SCM is presented; it achieves competitive performance on the challenging KITTI depth completion benchmark and, compared with approaches of similar accuracy, requires significantly fewer learnable parameters.
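The letter's abstract does not include an implementation, so the following PyTorch sketch is only a rough illustration of the DAN-Conv idea; the function name, the window size, and the masking scheme are assumptions, not the authors' method. For every pixel it gathers the nine nearest observed depth samples inside a small search window and weights them with the taps of an ordinary 3x3 convolution, so no extra learnable parameters are introduced:

```python
import torch
import torch.nn.functional as F


def dan_conv(depth, valid_mask, weight, bias=None, window=7):
    """Hypothetical depth-aware non-local convolution (illustrative only).

    depth:      (B, 1, H, W) sparse depth, zeros where unobserved
    valid_mask: (B, 1, H, W) 1.0 at observed pixels, 0.0 elsewhere
    weight:     (C_out, 1, 3, 3) taps of a standard 3x3 convolution, reused
    """
    B, _, H, W = depth.shape
    k = weight.shape[-2] * weight.shape[-1]            # 9 taps, reused as-is
    pad = window // 2

    # Unfold a (window x window) search region around every pixel.
    patches = F.unfold(depth, window, padding=pad)     # (B, window**2, H*W)
    valid = F.unfold(valid_mask, window, padding=pad)  # (B, window**2, H*W)

    # Squared distance of each window slot from the centre pixel.
    ys, xs = torch.meshgrid(torch.arange(window, device=depth.device),
                            torch.arange(window, device=depth.device),
                            indexing="ij")
    dist = ((ys - pad) ** 2 + (xs - pad) ** 2).float().flatten()

    # Rank slots by distance, pushing unobserved slots behind all observed
    # ones; pixels with fewer than k observed neighbours fall back to
    # zero-depth slots in this sketch.
    big = dist.max() + 1.0
    score = dist.view(1, -1, 1) + (1.0 - valid) * big  # (B, window**2, H*W)
    idx = score.topk(k, dim=1, largest=False).indices  # k nearest valid slots

    # Gather the chosen non-local taps and weight them with the 3x3 kernel.
    taps = patches.gather(1, idx)                      # (B, k, H*W)
    out = torch.einsum("ok,bkn->bon", weight.view(-1, k), taps)
    if bias is not None:
        out = out + bias.view(1, -1, 1)
    return out.view(B, -1, H, W)
```

Because the sketch only reuses the weights of an existing layer, dropping it in place of a standard convolution could look like `dan_conv(depth, mask, conv.weight, conv.bias)` for some `conv = nn.Conv2d(1, 16, 3, padding=1)`.

The SCM is likewise unspecified beyond the abstract; a minimal symmetric co-attention sketch, assuming simple 1x1-convolution sigmoid gates (the class and attribute names are hypothetical), might be:

```python
import torch
import torch.nn as nn


class SymmetricCoAttention(nn.Module):
    """Hypothetical co-attention fusion block (illustrative only)."""

    def __init__(self, channels):
        super().__init__()
        # Each gate turns one modality's features into a per-pixel
        # importance map for the *other* modality.
        self.depth_to_rgb_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid())
        self.rgb_to_depth_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid())

    def forward(self, feat_depth, feat_rgb):
        # Symmetric re-weighting: depth features gate image features
        # and vice versa, then the two streams are fused by concatenation.
        gated_rgb = feat_rgb * self.depth_to_rgb_gate(feat_depth)
        gated_depth = feat_depth * self.rgb_to_depth_gate(feat_rgb)
        return torch.cat([gated_depth, gated_rgb], dim=1)
```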
