
A Lightweight Change Detection Network based on Feature Interleaved Fusion and Bi-stage Decoding
Author(s) -
Mengmeng Wang,
Bai Zhu,
Jiacheng Zhang,
Jianwei Fan,
Yuanxin Ye
Publication year - 2023
Publication title -
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Language(s) - English
Resource type - Journals
eISSN - 2151-1535
pISSN - 1939-1404
DOI - 10.1109/JSTARS.2023.3344635
Subject(s) - geoscience, signal processing and analysis, power, energy and industry applications
Deep learning (DL) techniques for change detection have developed rapidly in recent years. However, it remains challenging to reduce the massive number of network parameters while sufficiently fusing bi-temporal image features to improve detection accuracy. Therefore, this work proposes a novel and lightweight network based on feature interleaved fusion and bi-stage decoding (FFBDNet) for change detection. In the encoding stage, to address the deployment problems caused by a large number of network parameters, we adopt the efficient EfficientNet as the backbone of a Siamese architecture to extract bi-temporal image features. To fuse the bi-temporal image features and reduce interference from surrounding objects, we propose a feature interleaved fusion module (FIFM) that interleaves shared feature information with difference feature information. During the decoding stage, the fused features are split into two groups, and a novel bi-stage decoding framework is proposed to progressively generate an accurate change map. Extensive experiments and ablation studies are conducted on three public change detection datasets: WHU-CD, LEVIR-CD, and SYSU-CD. Compared with state-of-the-art (SOTA) methods, the experimental results demonstrate that the proposed FFBDNet achieves a better balance between performance and model size. Specifically, the F1 scores on the three datasets are 93.27%, 91.11%, and 80.10%, respectively, while the network has only 2.85 M parameters.
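To make the abstract's pipeline concrete, the following PyTorch sketch illustrates the three pieces it names: a weight-shared (Siamese) EfficientNet encoder, a per-level fusion module, and a two-stage decoder. This is a minimal sketch under stated assumptions, not the authors' implementation: the paper does not specify the FIFM arithmetic or the bi-stage split here, so the sum/absolute-difference fusion, the choice of pyramid levels per decoding stage, and the names FIFM, FFBDNetSketch, coarse_head, and refine_head are all illustrative. Only the Siamese EfficientNet encoding follows directly from the text; the backbone is loaded via the real timm API.

```python
import torch
import torch.nn as nn
import timm


class FIFM(nn.Module):
    """Illustrative stand-in for the feature interleaved fusion module:
    combines shared information (element-wise sum) with difference
    information (absolute difference) from a bi-temporal feature pair."""

    def __init__(self, channels: int):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, f1: torch.Tensor, f2: torch.Tensor) -> torch.Tensor:
        shared = f1 + f2              # shared feature information
        diff = torch.abs(f1 - f2)     # difference (change) feature information
        return self.fuse(torch.cat([shared, diff], dim=1))


class FFBDNetSketch(nn.Module):
    """Minimal sketch of the overall pipeline: a weight-shared EfficientNet
    encoder, per-level FIFM fusion, and a bi-stage decoder that produces a
    coarse map from deep features, then refines it with shallow features."""

    def __init__(self):
        super().__init__()
        # features_only=True yields a pyramid of multi-scale feature maps.
        self.backbone = timm.create_model(
            "efficientnet_b0", features_only=True, pretrained=False
        )
        chs = self.backbone.feature_info.channels()  # e.g. [16, 24, 40, 112, 320]
        self.fifms = nn.ModuleList(FIFM(c) for c in chs)
        # Stage 1: coarse change prediction from the deepest fused level.
        self.coarse_head = nn.Conv2d(chs[-1], 1, kernel_size=1)
        # Stage 2: refinement using the shallowest fused level plus the coarse map.
        self.refine_head = nn.Conv2d(chs[0] + 1, 1, kernel_size=1)

    def forward(self, x1: torch.Tensor, x2: torch.Tensor) -> torch.Tensor:
        feats1 = self.backbone(x1)            # Siamese: identical weights
        feats2 = self.backbone(x2)            # applied to both time steps
        fused = [m(a, b) for m, a, b in zip(self.fifms, feats1, feats2)]

        coarse = self.coarse_head(fused[-1])  # stage 1: coarse change map
        coarse_up = nn.functional.interpolate(
            coarse, size=fused[0].shape[-2:], mode="bilinear", align_corners=False
        )
        refined = self.refine_head(torch.cat([fused[0], coarse_up], dim=1))
        return nn.functional.interpolate(     # stage 2: full-resolution map
            refined, size=x1.shape[-2:], mode="bilinear", align_corners=False
        )


if __name__ == "__main__":
    net = FFBDNetSketch()
    t1 = torch.randn(1, 3, 256, 256)   # image at time 1
    t2 = torch.randn(1, 3, 256, 256)   # image at time 2
    print(net(t1, t2).shape)           # torch.Size([1, 1, 256, 256])
```

Note that this sketch is architectural only: a stock EfficientNet-B0 backbone alone has more parameters than the paper's reported 2.85 M total, so the authors' lightweight budget evidently relies on design choices not reproduced here.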