Open Access
Combination of modified U‐Net and domain adaptation for road detection
Author(s) -
Dong Ming,
Zhao Xiangmo,
Fan Xing,
Shen Chao,
Liu Zhanwen
Publication year - 2019
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2018.6696
Subject(s) - domain adaptation , net (polyhedron) , adaptation (eye) , computer science , domain (mathematical analysis) , artificial intelligence , mathematics , optics , physics , mathematical analysis , geometry , classifier (uml)
Road detection is one of the crucial tasks for scene understanding in autonomous driving. Recently, methods based on deep learning have grown rapidly and addressed this task excellently, because they can extract richer features. In this study, the authors treat visual road detection as a per-pixel classification of the given image into road or non-road. Complex illumination conditions are encountered in traffic applications, so detection models often adapt poorly. The authors address this problem by proposing a deep network architecture that combines the network U-Net-prior with a domain adaptation model (DAM). U-Net-prior is a modified segmentation network that integrates a location prior and a shape prior into U-Net. The DAM reduces the gap between training images and test images; it is optimised by adversarial learning so that the features extracted from different datasets are drawn close to each other. The authors validate the effectiveness of each component of the algorithm and compare the overall architecture with other state-of-the-art methods. The results show that the architecture achieves top accuracy with the shortest run time among monocular-vision-based methods, while also achieving competitive results compared with methods based on other sensors.
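The adversarial scheme the abstract describes can be illustrated with a toy sketch: a shared feature extractor feeds both a road/non-road classifier (trained on labelled "source" pixels) and a domain discriminator, and the extractor's gradient from the domain loss is reversed so that source and target features become indistinguishable. This is a minimal numpy illustration of that idea only; all layer sizes, data, and hyperparameters are hypothetical and do not reproduce the paper's U-Net-prior or DAM.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, y):
    p = np.clip(p, 1e-7, 1 - 1e-7)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Toy sizes (hypothetical): 8 input features per pixel, 4 latent features.
D_IN, D_FEAT, N = 8, 4, 256
W_f = rng.normal(0, 0.1, (D_IN, D_FEAT))  # shared feature extractor
w_c = rng.normal(0, 0.1, D_FEAT)          # road / non-road classifier
w_d = rng.normal(0, 0.1, D_FEAT)          # domain discriminator

def features(X):    return np.tanh(X @ W_f)
def road_prob(F):   return sigmoid(F @ w_c)
def domain_prob(F): return sigmoid(F @ w_d)

# Synthetic "source" pixels (labelled) and "target" pixels (unlabelled,
# with a shifted distribution standing in for different illumination).
Xs = rng.normal(0.0, 1.0, (N, D_IN))
ys = (Xs[:, 0] > 0).astype(float)
Xt = rng.normal(0.5, 1.0, (N, D_IN))

lr, lam = 0.5, 0.1  # learning rate and reversal strength (hypothetical)
d = np.r_[np.ones(N), np.zeros(N)]  # domain labels: source=1, target=0
for step in range(200):
    Fs, Ft = features(Xs), features(Xt)
    F_all = np.vstack([Fs, Ft])

    # Discriminator step: minimise domain-classification loss.
    p_d = domain_prob(F_all)
    w_d -= lr * (F_all.T @ (p_d - d)) / len(d)

    # Task classifier step on labelled source pixels.
    p_c = road_prob(Fs)
    w_c -= lr * (Fs.T @ (p_c - ys)) / N

    # Feature extractor: descend on the task loss, but ASCEND on the
    # domain loss (gradient reversal), pushing the two domains together.
    dF_task = np.outer(p_c - ys, w_c) / N
    p_d = domain_prob(F_all)
    dF_dom = np.outer(p_d - d, w_d) / len(d)
    dF = np.vstack([dF_task, np.zeros_like(Ft)]) - lam * dF_dom
    dZ = dF * (1 - F_all ** 2)  # tanh'(z) = 1 - tanh(z)^2
    W_f -= lr * (np.vstack([Xs, Xt]).T @ dZ) / len(d)

acc = np.mean((road_prob(features(Xs)) > 0.5) == (ys > 0.5))
```

The gradient-reversal trick makes domain-invariance a single sign flip on the feature extractor's update; the paper's DAM pursues the same goal with a full adversarial network rather than this linear toy.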
