Open Access
Multi‐lane detection based on omnidirectional camera using anisotropic steerable filters
Author(s) -
Li Chuanxiang,
Dai Bin,
Wang Ruili,
Fang Yuqiang,
Yuan Xingsheng,
Wu Tao
Publication year - 2016
Publication title -
IET Intelligent Transport Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.579
H-Index - 45
eISSN - 1751-9578
pISSN - 1751-956X
DOI - 10.1049/iet-its.2015.0144
Subject(s) - computer vision , artificial intelligence , computer science , feature (linguistics) , intersection (aeronautics) , filter (signal processing) , omnidirectional antenna , engineering , telecommunications , philosophy , linguistics , antenna (radio) , aerospace engineering
Automated lane detection is a vital part of driver assistance systems in intelligent vehicles. In this study, a multi‐lane detection method based on omnidirectional images is presented to overcome the difficulties stemming from the limited field of view of rectilinear cameras. The contributions of this study are twofold. First, to extract the features of lane markings under various illumination and road‐surface conditions, a feature extractor based on anisotropic steerable filters is proposed. Second, a parabola lane model is used to fit both straight and curved lanes. Under the parabola lane model, the straight lines and curves of the feature maps can be represented as straight lines in a linear space coordinate system. Lane modelling can then be treated as an optimisation problem in that linear space, and the lane parameters can be estimated by minimising the objective function. The method has been tested on publicly available data sets and in real‐car experiments. Experimental results show that the proposed method outperforms state‐of‐the‐art approaches and achieves a detection accuracy of 99% in real‐world scenes.
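To make the first contribution concrete, the sketch below illustrates the general idea of an anisotropic steerable (oriented) filter for ridge-like lane markings: a second-derivative-of-Gaussian kernel with different scales across and along the line direction, steered to an orientation θ. This is a minimal illustration of the technique class, not the authors' exact filter bank; the kernel sizes, σ values, and the synthetic test image are assumptions for demonstration.

```python
import numpy as np

def anisotropic_kernel(theta, sigma_u=1.0, sigma_v=3.0, size=15):
    """Second-derivative-of-Gaussian kernel steered to orientation theta.

    Anisotropy (sigma_v > sigma_u) elongates the Gaussian along the line
    direction, which smooths noise along the marking while the second
    derivative across it responds strongly to bar-like lane markings.
    All parameter values here are illustrative, not from the paper.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates: u runs across the line, v runs along it.
    u = x * np.cos(theta) + y * np.sin(theta)
    v = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(u**2 / (2 * sigma_u**2) + v**2 / (2 * sigma_v**2)))
    # Second derivative of g with respect to u: a ridge (bar) detector.
    kernel = (u**2 / sigma_u**4 - 1 / sigma_u**2) * g
    return kernel - kernel.mean()  # zero mean: no response on flat road

def max_response(img, theta):
    """FFT-based circular convolution; return the peak response magnitude."""
    k = anisotropic_kernel(theta)
    pad = np.zeros_like(img)
    pad[:k.shape[0], :k.shape[1]] = k
    resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))
    return float(np.abs(resp).max())

# Synthetic scene: a bright vertical lane marking on dark asphalt.
img = np.zeros((64, 64))
img[:, 30:33] = 1.0

# The response peaks when the filter orientation matches the marking
# (theta = 0 puts the derivative axis across the vertical bar) and is
# weak when the filter is rotated 90 degrees away from it.
aligned = max_response(img, theta=0.0)
crossed = max_response(img, theta=np.pi / 2)
```

In a full pipeline along these lines, one would evaluate a small set of orientations per pixel and keep the maximum response as the lane-feature map that the subsequent parabola-model fitting consumes.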
