Open Access
Ship Classification by the Fusion of Panchromatic Image and Multi-spectral Image Based on Pseudo Siamese Lightweight Network
Author(s) -
Mengyang Li,
Weiwei Sun,
Xuan Du,
Xiaohan Zhang,
Libo Yao
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1757/1/012022
Subject(s) - panchromatic film , multispectral image , computer science , artificial intelligence , feature extraction , feature (linguistics) , image fusion , image (mathematics) , pattern recognition (psychology) , computer vision , convolutional neural network , feature detection (computer vision) , channel (broadcasting) , dimension (graph theory) , contextual image classification , fusion , remote sensing , image processing , geography , telecommunications , mathematics , philosophy , linguistics , pure mathematics
The rapid development of the remote sensing satellite industry provides a large amount of image data for ship classification tasks. To address the insufficient feature extraction achievable from a single-source image, this paper designs a lightweight ship classification model based on a pseudo-Siamese network that fuses panchromatic and multispectral images to extract image features more fully. First, a multi-source remote sensing ship target classification dataset, MPFS (MS and PAN Ship image Fusion Classification Dataset), is established; second, the panchromatic and multispectral images are fed into the network through separate convolutional layers, with a multi-level feature extraction network designed for the panchromatic images and an adaptive feature extraction network for the multispectral images; finally, the features of the two branches are concatenated along the channel dimension and sent to the classification network.
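The abstract describes a two-branch (pseudo-Siamese) design: each modality passes through its own convolutional branch, and the resulting feature maps are fused by channel-wise concatenation before a classification head. The following is a minimal PyTorch sketch of that pattern; all layer sizes, channel counts (e.g. 4 multispectral bands), class counts, and the assumption that both inputs share the same spatial size are illustrative, since the paper's exact architecture is not given here.

```python
# Minimal sketch of a pseudo-Siamese fusion classifier: two unshared
# convolutional branches (PAN and MS), channel-wise concatenation, and a
# small classification head. Hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn

class PseudoSiameseFusionNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Panchromatic branch: single-channel input.
        self.pan_branch = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Multispectral branch: assumed 4 spectral bands as input channels.
        # Unlike a true Siamese network, the weights are NOT shared.
        self.ms_branch = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Fused features (64 + 64 channels) go to the classification head.
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64 + 64, num_classes),
        )

    def forward(self, pan: torch.Tensor, ms: torch.Tensor) -> torch.Tensor:
        f_pan = self.pan_branch(pan)              # panchromatic features
        f_ms = self.ms_branch(ms)                 # multispectral features
        fused = torch.cat([f_pan, f_ms], dim=1)   # channel-dimension fusion
        return self.classifier(fused)

# Toy usage. In practice PAN imagery is higher resolution than MS; this
# sketch assumes both have been resampled to the same spatial size so the
# feature maps align for concatenation.
model = PseudoSiameseFusionNet(num_classes=10)
pan = torch.randn(2, 1, 64, 64)   # batch of panchromatic chips
ms = torch.randn(2, 4, 64, 64)    # batch of 4-band multispectral chips
logits = model(pan, ms)           # shape: (2, 10)
```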
