
U‐FPNDet: A one‐shot traffic object detector based on U‐shaped feature pyramid module
Author(s) -
Ke Xiao,
Jianping Li
Publication year - 2021
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/ipr2.12182
Subject(s) - pyramid (geometry) , computer science , artificial intelligence , pooling , object detection , feature (linguistics) , detector , computer vision , context (archaeology) , pedestrian detection , bottleneck , pixel , feature extraction , cascade , pattern recognition (psychology) , pedestrian , engineering , mathematics , telecommunications , embedded system , linguistics , philosophy , paleontology , geometry , chemical engineering , transport engineering , biology
In the field of automatic driving, identifying vehicles and pedestrians is the starting point for other automatic-driving techniques, so detecting traffic targets from the information collected by the camera is particularly important. The main bottleneck in traffic object detection is that targets of the same category may appear at very different scales: the pixel size of cars, for example, may range from 30 to 300 px, which causes instability in localization and classification. In this paper, a multi-dimension feature pyramid is constructed to address this multi-scale problem. The pyramid is built by developing a U-shaped module and applying a cascade method. To verify the effectiveness of the U-shaped module, we also design a new one-shot detector, U-FPNDet. The model first extracts a basic feature map with a backbone network and constructs the multi-dimension feature pyramid. Next, a pyramid pooling module gathers additional context information from the scene. Finally, a detection network runs on each level of the pyramid, and the final result is obtained by non-maximum suppression (NMS). With this method, state-of-the-art performance is achieved for both detection and classification on commonly used benchmarks.
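The abstract does not specify the exact post-processing used by U-FPNDet beyond naming NMS. As a minimal sketch of the standard greedy NMS step that merges the per-level detections, assuming the common `[x1, y1, x2, y2]` box convention and an illustrative IoU threshold of 0.5 (neither is stated in the source):

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression.

    boxes: (N, 4) array of [x1, y1, x2, y2] corners.
    scores: (N,) confidence values.
    Returns indices of kept boxes, highest score first.
    """
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]  # process highest-scoring boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Intersection of the kept box with all remaining candidates
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Suppress candidates that overlap the kept box too strongly
        order = order[1:][iou <= iou_threshold]
    return keep
```

For example, two near-identical car boxes at different confidences collapse to one detection, while a distant pedestrian box survives: `nms(np.array([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]], float), np.array([0.9, 0.8, 0.7]))` keeps indices `[0, 2]`.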