An End-to-end Auto-driving Method Based on 3D Lidar
Author(s) -
Meng Wang,
Han Dong,
Wei Zhang,
Wei Shu,
Chao Chen,
Yuanzhi Lü,
Hongfei Li
Publication year - 2019
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1288/1/012061
Subject(s) - lidar, end-to-end principle, computer science, point cloud, deep learning, artificial intelligence, artificial neural network, advanced driver assistance systems, convolutional neural network, computer vision, real time computing, remote sensing, geography
The development of artificial intelligence, and of deep learning engineering technology in particular, has made auto-driving cars increasingly realistic. The end-to-end auto-driving method is an automatic driving approach that differs from traditional rule-based systems: it maps environmental sensor data directly to vehicle control outputs, greatly reducing system complexity. 3D Lidar is a core sensor of automatic driving systems. In this paper, a deep convolutional neural network is designed for the end-to-end automatic driving method. Using 64-line 3D Lidar data and a transformation algorithm, the 3D Lidar point-cloud data are converted into depth images that can be used directly by an end-to-end deep learning network. The 3D Lidar data are then matched with vehicle-mounted CAN bus data to obtain the samples and labels that are fed into the deep learning network. The output of the network is the control information that acts directly on the vehicle. Experimental verification shows that the end-to-end automatic driving method based on 3D Lidar is of great value and has potential for further development.

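The paper itself does not include code, but the point-cloud-to-depth-image step the abstract describes is commonly done with a spherical projection: each lidar return is binned by its azimuth and elevation angle, and the pixel value is the measured range. The Python/NumPy sketch below illustrates that idea only; the image size (64 x 512) and the vertical field of view (about +2 to -24.8 degrees, typical of a 64-line Velodyne HDL-64E) are assumptions for illustration, not values taken from the paper.

    import numpy as np

    def pointcloud_to_depth_image(points, h=64, w=512,
                                  fov_up_deg=2.0, fov_down_deg=-24.8):
        """Project an (N, 3) lidar point cloud onto a spherical depth image.

        Each point (x, y, z) is mapped to a pixel by its azimuth and
        elevation angles; the pixel value is the range in metres. The
        vertical field of view defaults to roughly that of a 64-line
        Velodyne HDL-64E (an assumption, not a value from the paper).
        """
        depth = np.linalg.norm(points[:, :3], axis=1)
        valid = depth > 1e-6
        x, y, z = points[valid, 0], points[valid, 1], points[valid, 2]
        depth = depth[valid]

        yaw = np.arctan2(y, x)            # azimuth in [-pi, pi]
        pitch = np.arcsin(z / depth)      # elevation angle

        fov_up = np.deg2rad(fov_up_deg)
        fov_down = np.deg2rad(fov_down_deg)
        fov = fov_up - fov_down

        # Normalise the angles to [0, 1] and scale to pixel coordinates.
        u = 0.5 * (1.0 - yaw / np.pi) * w     # column from azimuth
        v = (fov_up - pitch) / fov * h        # row from elevation

        u = np.clip(np.floor(u), 0, w - 1).astype(np.int32)
        v = np.clip(np.floor(v), 0, h - 1).astype(np.int32)

        # Keep the nearest return when several points land in one pixel:
        # write points in order of decreasing depth so closer ones win.
        order = np.argsort(depth)[::-1]
        image = np.zeros((h, w), dtype=np.float32)
        image[v[order], u[order]] = depth[order]
        return image

    # Example: project a random synthetic cloud (stands in for one sweep).
    if __name__ == "__main__":
        cloud = np.random.uniform(-50, 50, size=(100_000, 3))
        img = pointcloud_to_depth_image(cloud)
        print(img.shape, img.max())

The resulting 64-row image has one row per laser ring, which is what allows an ordinary 2D convolutional network to consume raw lidar sweeps directly.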