Open Access
SAM-Guided Attention Maps Fusion for Region Supervised Remote Sensing Image Change Detection
Author(s) -
Yun Liu,
Zhi-Hui You,
Si-Bao Chen,
Xiao Wang,
Li-Xiang Xu,
Jin Tang,
Bin Luo
Publication year - 2025
Publication title -
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 1.246
H-Index - 88
eISSN - 2151-1535
pISSN - 1939-1404
DOI - 10.1109/jstars.2025.3610503
Subject(s) - geoscience, signal processing and analysis, power, energy and industry applications
In practical applications, obtaining pixel-level annotations is both time-consuming and labor-intensive, posing a major challenge for large-scale deployment. In this paper, we propose a segment anything model (SAM) guided attention maps fusion network for region supervised change detection (CD), termed SGAMFNet, which integrates the pre-trained foundation model SAM with foreground- and background-aware attention maps. Before training, SAM is used to generate segmentation maps from region labels, providing an initial separation of foreground and background within each region and avoiding the high computational cost of integrating the foundation model directly into the network. A foreground-background separation module (FBSM) is then designed to further refine this separation and generate higher-quality pseudo labels. Specifically, it constructs a background attention map by retrieving background prototypes to suppress interference from unchanged pixels, and generates a foreground attention map by normalizing the difference features to highlight changed regions. Experimental results on the LEVIR, BCDD, and S2Looking datasets demonstrate that SGAMFNet performs well, effectively avoiding false detections caused by the mixing of foreground and background within region-level labels and thereby improving detection accuracy. This approach provides a novel perspective on region supervised CD and highlights the potential of foundation models in weakly supervised remote sensing image analysis. The code is available at https://github.com/yunL719/SGAMFNet.
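A minimal PyTorch-style sketch of the two attention maps described above may help make the mechanism concrete: the background attention map scores each pixel by its similarity to retrieved background prototypes, and the foreground attention map is a min-max normalized difference of the bitemporal features. All names, tensor shapes, and the final fusion step are illustrative assumptions rather than the authors' implementation; the official code is at the repository linked above.

import torch
import torch.nn.functional as F

def background_attention(feats, bg_prototypes):
    # feats: (B, C, H, W) fused bitemporal features; bg_prototypes: (K, C) background prototypes
    B, C, H, W = feats.shape
    f = F.normalize(feats.flatten(2), dim=1)             # (B, C, H*W), unit-norm along channels
    p = F.normalize(bg_prototypes, dim=1)                 # (K, C), unit-norm prototypes
    sim = torch.einsum('kc,bcn->bkn', p, f)               # cosine similarity to each prototype
    return sim.max(dim=1).values.view(B, 1, H, W)         # strongest background match per pixel

def foreground_attention(feat_t1, feat_t2):
    # channel-averaged absolute difference of the two temporal features, min-max normalized
    diff = torch.abs(feat_t1 - feat_t2).mean(dim=1, keepdim=True)  # (B, 1, H, W)
    mn = diff.amin(dim=(2, 3), keepdim=True)
    mx = diff.amax(dim=(2, 3), keepdim=True)
    return (diff - mn) / (mx - mn + 1e-6)

def fuse(feats, fg_att, bg_att):
    # emphasize changed regions while suppressing pixels that resemble background prototypes
    return feats * fg_att * (1.0 - bg_att)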
