Parallel Cascade R-CNN for object detection in remote sensing imagery
Author(s) -
Jingyou Hou,
Hongbing Ma,
Shengjin Wang
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1544/1/012124
Subject(s) - computer science , preprocessor , object detection , artificial intelligence , cascade , minimum bounding box , focus (optics) , computer vision , pattern recognition (psychology) , orientation (vector space) , image (mathematics) , mathematics , chemistry , physics , geometry , chromatography , optics
Object detection in remote sensing imagery is a challenging task in computer vision. Remote sensing images have characteristics that differ from those of conventional images; in particular, detection must focus on small targets with varying aspect ratios and orientations. In this paper, we propose a novel detection architecture for remote sensing imagery to address the problem of scale diversity. Building on Cascade R-CNN, we develop Parallel Cascade R-CNN. In the second stage, parallel detection heads perform detection separately, and their RoIAlign modules have different output sizes. In addition, different preprocessing methods are applied according to the shape and quantity characteristics of different classes. We evaluate our algorithm on the DOTA dataset. Experiments show that our algorithm achieves a performance improvement over a strong baseline, and detection performance for categories with smaller objects is improved. In the horizontal bounding box detection task, we obtain an mAP of 78.96%, reaching the state of the art. Our algorithm is simple, performs well, and is easy to migrate to various network structures.
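
The parallel second-stage design described in the abstract can be illustrated with a short sketch. The following PyTorch-style code is a minimal, hypothetical illustration, not the authors' released implementation: two detection heads run in parallel, each with its own RoIAlign module using a different output size (here 7×7 and 14×14, chosen only as an example), and each head produces its own predictions. All module names, channel counts, and sizes are assumptions.

```python
# Minimal sketch of parallel second-stage heads with different RoIAlign
# output sizes. Names, sizes, and channel counts are illustrative
# assumptions, not the authors' implementation.
import torch
import torch.nn as nn
from torchvision.ops import RoIAlign


class ParallelDetectionHeads(nn.Module):
    def __init__(self, in_channels=256, num_classes=16,
                 output_sizes=(7, 14), spatial_scale=1.0 / 16):
        super().__init__()
        # One RoIAlign module per parallel head, each with its own output size.
        self.roi_aligns = nn.ModuleList([
            RoIAlign(output_size=s, spatial_scale=spatial_scale,
                     sampling_ratio=2, aligned=True)
            for s in output_sizes
        ])
        # One box head per RoIAlign output size; each flattens the pooled
        # features and predicts class scores plus per-class box deltas.
        self.box_heads = nn.ModuleList([
            nn.Sequential(
                nn.Flatten(),
                nn.Linear(in_channels * s * s, 1024),
                nn.ReLU(inplace=True),
                nn.Linear(1024, num_classes + num_classes * 4),
            )
            for s in output_sizes
        ])

    def forward(self, feature_map, rois):
        # feature_map: [N, C, H, W]; rois: [K, 5] as (batch_idx, x1, y1, x2, y2)
        outputs = []
        for roi_align, head in zip(self.roi_aligns, self.box_heads):
            pooled = roi_align(feature_map, rois)   # [K, C, s, s]
            outputs.append(head(pooled))            # one prediction per head
        return outputs


# Usage example with dummy data
feats = torch.randn(1, 256, 64, 64)
rois = torch.tensor([[0.0, 10.0, 10.0, 50.0, 50.0]])
heads = ParallelDetectionHeads()
preds = heads(feats, rois)
```

In this sketch each parallel head sees the same proposals but pools them at a different resolution, which is one plausible way to realize the paper's idea of letting separate heads specialize for objects of different scales.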
