Open Access
Ship Classification from SAR Images based on Sequence Input of Deep Neural Network
Author(s) -
Huaiyu Zhu,
Nan Lin,
David W. M. Leung
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1549/5/052042
Subject(s) - computer science , artificial intelligence , convolutional neural network , synthetic aperture radar , pattern recognition (psychology) , artificial neural network , sequence (biology) , deep learning , contextual image classification , frame (networking) , set (abstract data type) , image (mathematics) , telecommunications , genetics , biology , programming language
This article proposes a new deep-learning classification architecture for SAR images of ships. Unlike the widely used conventional classification algorithms based on deep neural networks, which take a single image as input, the proposed network takes a sequence of SAR images as input. The architecture consists of two main neural networks: a convolutional neural network that extracts features from each frame, and an LSTM that is trained on the resulting feature sequence to predict the labels. Because the available SAR data are limited, strengthening the connection among the SAR images is particularly important, so all the SAR images are grouped into several sequences. The proposed architecture is trained on the OpenSARShip [2] dataset captured by the Sentinel-1 space-borne radar. Experimental results show that the architecture achieves an accuracy of 99.24% in classifying the six kinds of targets, which is more precise than previous conventional methods.
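A minimal sketch of the CNN-plus-LSTM sequence classifier the abstract describes is shown below. The framework (PyTorch), the layer sizes, the single-channel 64×64 chip resolution, and the sequence length are all assumptions made for illustration; the abstract does not specify them.

```python
# Hedged sketch: a per-frame CNN feature extractor followed by an LSTM over
# the frame sequence, classifying into six ship categories. All architectural
# details below are illustrative assumptions, not the authors' exact network.
import torch
import torch.nn as nn


class CNNLSTMShipClassifier(nn.Module):
    def __init__(self, num_classes=6, hidden_size=128):
        super().__init__()
        # Convolutional feature extractor applied to every frame independently.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),  # -> 32 * 4 * 4 = 512 features per frame
        )
        # LSTM over the sequence of per-frame feature vectors.
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, 1, H, W) -- a sequence of single-channel SAR chips.
        b, t, c, h, w = x.shape
        feats = self.cnn(x.reshape(b * t, c, h, w)).reshape(b, t, -1)
        out, _ = self.lstm(feats)
        # Predict the class from the LSTM state at the final time step.
        return self.head(out[:, -1, :])


if __name__ == "__main__":
    model = CNNLSTMShipClassifier()
    dummy = torch.randn(2, 8, 1, 64, 64)  # 2 sequences of 8 frames each
    logits = model(dummy)
    print(logits.shape)  # torch.Size([2, 6])
```

Grouping single SAR chips into sequences, as the abstract suggests, lets the LSTM aggregate evidence across frames, which is one plausible way to compensate for the limited size of SAR ship datasets.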
