City object detection from airborne Lidar data with OpenStreetMap‐tagged superpixels
Author(s) -
Mao Bo,
Li Bingchan
Publication year - 2020
Publication title - Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.6026
Subject(s) - lidar , computer science , artificial intelligence , segmentation , object detection , graph , computer vision , remote sensing , pattern recognition (psychology) , object (grammar) , geography , theoretical computer science
Summary - Lidar-based city object detection has become an active research topic with the development of laser scanning equipment, and it has been widely applied in areas such as 3D building reconstruction and navigation. In this article, we describe a city object detection algorithm for airborne Lidar images using superpixel segmentation and DenseNet classification. Compared with existing studies, this article offers two innovations. First, a DenseNet-based city object classification model is trained on data sets automatically labeled from OpenStreetMap. Second, graph analysis is applied to further improve the classification of the superpixels. The results of an experiment in the London area indicate that the DenseNet-based classification model trained on OpenStreetMap data can achieve 86% classification accuracy for building objects. With the proposed graph analysis, the detection accuracy for building objects increased to 98.5% in the test areas. We also verified that dividing the city area into different types, such as commercial, residential, and rural, can further improve detection accuracy. Based on these extensive experiments, we suggest that the proposed superpixel classification method can be used to detect city objects from large-scale, low-resolution (50 cm) Lidar image data.
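The pipeline the abstract describes (segment the raster into superpixels, classify each superpixel, then refine the labels with graph analysis over adjacent superpixels) can be illustrated with a heavily simplified sketch. Everything below is a hypothetical stand-in: square grid cells replace the paper's superpixel segmentation, a mean-height threshold replaces the DenseNet classifier, and a 4-neighbour majority vote replaces the paper's graph analysis; none of these are the authors' actual implementation.

```python
import numpy as np

def grid_superpixels(height_map, cell=8):
    """Partition the raster into square cells -- a crude stand-in for
    superpixel segmentation (hypothetical simplification)."""
    h, w = height_map.shape
    nx = (w + cell - 1) // cell          # cells per row
    labels = np.zeros((h, w), dtype=int)
    for i in range(h):
        for j in range(w):
            labels[i, j] = (i // cell) * nx + (j // cell)
    return labels

def classify_superpixels(height_map, labels, threshold=5.0):
    """Mark a superpixel as 'building' (1) if its mean height exceeds a
    threshold -- standing in for the paper's DenseNet classifier."""
    return {int(lab): int(height_map[labels == lab].mean() > threshold)
            for lab in np.unique(labels)}

def graph_refine(labels, preds, cell=8):
    """Majority vote over the 4-neighbour superpixel adjacency graph,
    mimicking the graph-analysis refinement step of the pipeline."""
    h, w = labels.shape
    ny, nx = (h + cell - 1) // cell, (w + cell - 1) // cell
    refined = {}
    for lab, p in preds.items():
        gy, gx = divmod(lab, nx)
        votes = [p]                       # the cell's own prediction
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            qy, qx = gy + dy, gx + dx
            if 0 <= qy < ny and 0 <= qx < nx:
                votes.append(preds[qy * nx + qx])
        refined[lab] = int(sum(votes) > len(votes) / 2)
    return refined
```

On a synthetic height map with one tall block and one isolated noisy cell, the threshold classifier marks the noisy cell as a building, and the neighbour vote flips it back to ground, which is the role graph analysis plays in lifting building accuracy from 86% to 98.5% in the paper's experiments.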
