Open Access
A novel Spectral-Spatial Attention Network for Zero-Shot Pansharpening
Author(s) -
Hailiang Lu,
Mercedes E. Paoletti,
Juan M. Haut,
Sergio Moreno-Alvarez,
Guangsheng Chen,
Weipeng Jing
Publication year - 2025
Publication title -
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 1.246
H-Index - 88
eISSN - 2151-1535
pISSN - 1939-1404
DOI - 10.1109/jstars.2025.3587244
Subject(s) - geoscience, signal processing and analysis, power, energy and industry applications
Over the past decades, pansharpening technologies have received much attention for the spatial detail they introduce into multispectral (MS) images by referencing panchromatic (PAN) images. Traditional approaches struggle to capture the nonlinear relationships between the original MS-PAN pair and the pansharpened MS, whilst deep learning (DL) based methods introduce new challenges. Unsupervised pansharpening is prone to distortion because reference images are unavailable and the degradation process is modeled with errors. Supervised methods are trained at reduced resolution before migrating to full resolution, which leads to undesirable results due to scale variations. Meanwhile, the constructed training data cannot encompass all possible scenes, hindering reconstruction of unknown scenes. To tackle these challenges, we propose a novel Zero-Shot Pansharpening Network (ZSPNet), which conducts training and testing exclusively on the target image pair, ensuring robust performance on unknown scenes. Furthermore, ZSPNet effectively reduces scale variance thanks to its tailored network with an appropriate patch size. By integrating 3D convolutional neural networks (3D-CNN), spatial attention, and channel attention, ZSPNet accurately reconstructs MS images with enhanced spatial resolution. Experiments were conducted on the public PAirMax dataset, which contains challenging scenes captured by different sensors. Compared with state-of-the-art traditional and DL-based methods, ZSPNet demonstrates superior performance in both quantitative assessments and visual results.
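The abstract combines channel attention and spatial attention on MS feature cubes. As an illustration only (the paper's exact layers are not given here), the sketch below shows the two generic attention mechanisms named in the abstract, in NumPy on a toy 4-band patch; the weights, reduction ratio, and pooling choices are assumptions, not ZSPNet's actual design.

```python
import numpy as np

def channel_attention(x, reduction=2):
    """Squeeze-and-excitation-style channel gate (hypothetical sketch).
    x: (C, H, W) feature cube; returns x scaled per channel."""
    c = x.shape[0]
    # Squeeze: global average pooling per channel.
    z = x.mean(axis=(1, 2))                              # (C,)
    # Excite: two toy linear maps with fixed random weights (illustrative).
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    gate = 1.0 / (1.0 + np.exp(-(w2 @ np.maximum(w1 @ z, 0.0))))  # sigmoid, (C,)
    return x * gate[:, None, None]

def spatial_attention(x):
    """Spatial gate: weight each pixel by pooled cross-channel statistics."""
    avg = x.mean(axis=0)                                 # (H, W)
    mx = x.max(axis=0)                                   # (H, W)
    gate = 1.0 / (1.0 + np.exp(-(avg + mx)))             # sigmoid, (H, W)
    return x * gate[None, :, :]

# Toy 4-band MS patch; both gates preserve the cube's shape.
ms = np.random.default_rng(1).random((4, 8, 8))
out = spatial_attention(channel_attention(ms))
print(out.shape)  # (4, 8, 8)
```

Both gates lie in (0, 1), so each mechanism reweights rather than replaces the features, which is what lets them be stacked with a 3D-CNN backbone.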
