
Class-aware Self-distillation for Remote Sensing Image Scene Classification
Author(s) - Bin Wu, Siyuan Hao, Wei Wang
Publication year - 2023
Publication title - IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Language(s) - English
Resource type - Journals
eISSN - 2151-1535
pISSN - 1939-1404
DOI - 10.1109/JSTARS.2023.3343521
Subject(s) - geoscience, signal processing and analysis, power, energy and industry applications
Currently, Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs) are the predominant neural network architectures for remote sensing image scene classification. CNNs have lower computational complexity, while ViTs have a higher performance ceiling, making both suitable as backbone networks for this task. However, remote sensing imagery exhibits high intra-class variation and high inter-class similarity, which poses a challenge for existing methods. To address this issue, we propose the Class-aware Self-distillation (CASD) framework, which uses an end-to-end distillation mechanism to mine class-aware knowledge and thereby reduces the impact of the strong intra-class variation and inter-class similarity in remote sensing imagery. Specifically, we construct pairs of images: similar pairs consist of images from the same class, and dissimilar pairs consist of images from different classes. We then apply a purpose-designed distillation loss to the corresponding probability distributions so that the distributions of similar pairs become more consistent and those of dissimilar pairs become more distinct. In addition, a learnable weight α in the distillation loss further strengthens the network's ability to capture class-aware knowledge. Experiments show that CASD outperforms competing methods on four publicly available datasets, and ablation studies confirm the effectiveness of each component.
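To make the pairwise mechanism described above concrete, the sketch below implements one plausible form of such a class-aware distillation loss in PyTorch. It is not the authors' code: the symmetric-KL pair divergence, the margin on dissimilar pairs, the temperature, and the class name PairDistillationLoss are illustrative assumptions; only the pull/push behavior on class-probability distributions and the learnable α follow the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PairDistillationLoss(nn.Module):
    """Illustrative pair-based distillation loss (assumed form, not the paper's code).

    Pulls together the softmax distributions of same-class (similar) pairs and
    pushes apart those of different-class (dissimilar) pairs, weighted by a
    learnable alpha, mirroring the mechanism described in the abstract.
    """

    def __init__(self, temperature: float = 1.0, margin: float = 1.0):
        super().__init__()
        # Learnable alpha that scales the distillation term (assumed placement).
        self.alpha = nn.Parameter(torch.tensor(1.0))
        self.temperature = temperature
        self.margin = margin

    def forward(self, logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Class-probability distribution for every image in the batch.
        log_p = F.log_softmax(logits / self.temperature, dim=1)
        p = log_p.exp()

        # Pairwise KL(p_i || p_j) for all (i, j), symmetrized; shape (B, B).
        kl = (p.unsqueeze(1) * (log_p.unsqueeze(1) - log_p.unsqueeze(0))).sum(-1)
        sym_kl = kl + kl.t()

        same = labels.unsqueeze(0) == labels.unsqueeze(1)
        eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
        pos = same & ~eye   # similar pairs, excluding self-pairs
        neg = ~same         # dissimilar pairs

        # Similar pairs: minimize divergence so distributions become consistent.
        pull = sym_kl[pos].mean() if pos.any() else logits.new_zeros(())
        # Dissimilar pairs: hinge loss pushing divergence up to a margin.
        push = F.relu(self.margin - sym_kl[neg]).mean() if neg.any() else logits.new_zeros(())
        return self.alpha * (pull + push)
```

In practice such a term would be added to the standard cross-entropy objective, e.g. loss = F.cross_entropy(logits, labels) + pair_loss(logits, labels), so the network still learns the correct labels while the pair term shapes the class-aware structure of its output distributions.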