Open Access
Airborne infrared aircraft target detection algorithm based on YOLOv4-tiny
Author(s) -
Xiangguan Hou,
Jiayi Ma,
Shaofei Zang
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1865/4/042007
Subject(s) - feature (linguistics) , computer science , convolution (computer science) , artificial intelligence , pattern recognition (psychology) , object detection , feature extraction , algorithm , frame (networking) , block (permutation group theory) , computer vision , mathematics , artificial neural network , telecommunications , philosophy , linguistics , geometry
Aiming at the problem that airborne infrared aircraft targets are blurred, have low contrast, and are susceptible to noise interference, which leads to inaccurate recognition, an improved YOLOv4-tiny infrared aircraft target detection method based on dilated (atrous) convolution is proposed. First, to make full use of shallow features, a parallel branch is added after a shallow feature layer of YOLOv4-tiny; then, after the first network output layer, three parallel depthwise separable dilated convolution layers with dilation rates of 1, 3, and 5 are added to expand the receptive field of the feature map; finally, the feature fusion network is improved: features are extracted from the final output feature layer of the network by an improved inverted residual block, their size is adjusted by convolution, and the prediction result is processed and output by the YOLO head. Experiments on an airborne infrared aircraft data set show that, compared with the original YOLOv4-tiny, detection accuracy is increased by 4.29% and the detection effect is significantly improved, with only a small loss of detection frame rate and a small model weight.
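The following is a minimal sketch, not the authors' code, of the multi-branch dilated depthwise separable convolution module described in the abstract, assuming a PyTorch implementation. The branch count, the dilation rates 1, 3 and 5, and the depthwise separable structure follow the text; the channel count, the BatchNorm/LeakyReLU choices, and the 1x1 fusion convolution are illustrative assumptions.

import torch
import torch.nn as nn


class DilatedDWBranch(nn.Module):
    """Depthwise separable convolution with a given dilation rate (assumed layout)."""

    def __init__(self, channels: int, dilation: int):
        super().__init__()
        self.block = nn.Sequential(
            # Depthwise 3x3 convolution; padding equal to the dilation rate
            # keeps the spatial size of the feature map unchanged.
            nn.Conv2d(channels, channels, kernel_size=3,
                      padding=dilation, dilation=dilation,
                      groups=channels, bias=False),
            nn.BatchNorm2d(channels),
            nn.LeakyReLU(0.1, inplace=True),
            # Pointwise 1x1 convolution mixes the channels.
            nn.Conv2d(channels, channels, kernel_size=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.LeakyReLU(0.1, inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


class MultiDilationModule(nn.Module):
    """Three parallel branches with dilation rates 1, 3 and 5, fused back to
    the original channel count with a 1x1 convolution (fusion is an assumption)."""

    def __init__(self, channels: int, dilations=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            DilatedDWBranch(channels, d) for d in dilations
        )
        self.fuse = nn.Conv2d(channels * len(dilations), channels,
                              kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [branch(x) for branch in self.branches]
        return self.fuse(torch.cat(feats, dim=1))


# Usage example on a 13x13 feature map, a typical YOLOv4-tiny output resolution.
if __name__ == "__main__":
    fmap = torch.randn(1, 256, 13, 13)
    out = MultiDilationModule(256)(fmap)
    print(out.shape)  # torch.Size([1, 256, 13, 13])

Stacking parallel branches with increasing dilation rates enlarges the receptive field without downsampling, which is the stated goal of this part of the improved network.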
