Open Access
Exploring the Role of Depth Information from DAM in Diverse Remote Sensing Semantic Segmentation Tasks
Author(s) -
Dehao Zhou,
Fukun Bi,
Xinghai Hou,
Xianping Ma,
Zhiliu Yang,
Binbin Yang
Publication year - 2025
Publication title -
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 1.246
H-Index - 88
eISSN - 2151-1535
pISSN - 1939-1404
DOI - 10.1109/jstars.2025.3620999
Subject(s) - geoscience, signal processing and analysis, power, energy and industry applications
Existing semantic segmentation methods for remote sensing images focus mainly on planar features to boost performance and give little consideration to the potential advantages of depth features. To address this issue, we integrate monocular depth estimates generated by the Depth Anything Model (DAM) and propose three methods that progressively deepen the use of depth information in fully supervised, semi-supervised, and unsupervised domain adaptation tasks. Specifically, for fully supervised tasks, a Depth-Aware Edge Consistency Loss enhances boundary localization and mitigates the adverse effects of intra-class depth variations; for semi-supervised tasks, a Depth-Guided Weight Update, guided by a Mixture of Experts (MoE), injects geometric cues and abstract semantics from depth into unlabeled data; and for unsupervised domain adaptation tasks, a Depth-Enhanced Adversarial Feature Learning Strategy aligns stereo features between the source and target domains to improve adaptation. Extensive experiments on various remote sensing datasets demonstrate the effectiveness and generality of the proposed methods, which require no modifications to the model architecture and can be readily adapted to similar tasks, yielding average mIoU gains of 0.93%, 1.47%, and 1.29% on fully supervised, semi-supervised, and unsupervised domain adaptation tasks, respectively. Our code is available at https://github.com/dehaozhou/Explore_Depth.
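The abstract does not specify how the Depth-Aware Edge Consistency Loss is computed; the authors' implementation is in the linked repository. As a rough illustration only, the following is a minimal PyTorch sketch of one plausible form of such a loss, assuming DAM depth maps are precomputed and normalized, and that the loss penalizes predicted class boundaries falling where depth is smooth. All names (sobel_edges, depth_aware_edge_consistency_loss) and design choices here are hypothetical, not taken from the paper.

```python
import torch
import torch.nn.functional as F


def sobel_edges(x: torch.Tensor) -> torch.Tensor:
    """Edge magnitude of an (N, 1, H, W) map via Sobel filters."""
    kx = torch.tensor([[-1.0, 0.0, 1.0],
                       [-2.0, 0.0, 2.0],
                       [-1.0, 0.0, 1.0]], device=x.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(x, kx, padding=1)
    gy = F.conv2d(x, ky, padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-8)


def depth_aware_edge_consistency_loss(logits: torch.Tensor,
                                      dam_depth: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch: discourage predicted class boundaries at
    locations where the DAM depth map is smooth, while tolerating
    boundaries that coincide with depth discontinuities.

    logits:    (B, C, H, W) segmentation logits
    dam_depth: (B, 1, H, W) monocular depth, normalized to [0, 1]
    """
    probs = torch.softmax(logits, dim=1)
    b, c, h, w = probs.shape

    # Per-class soft boundaries, summed into one boundary map per image.
    pred_edges = sobel_edges(probs.reshape(b * c, 1, h, w))
    pred_edges = pred_edges.reshape(b, c, h, w).sum(dim=1, keepdim=True)
    depth_edges = sobel_edges(dam_depth)

    # Normalize both edge maps to [0, 1] per image so they are comparable.
    pred_edges = pred_edges / (pred_edges.amax(dim=(2, 3), keepdim=True) + 1e-8)
    depth_edges = depth_edges / (depth_edges.amax(dim=(2, 3), keepdim=True) + 1e-8)

    # Penalize predicted edges where depth shows no discontinuity; the
    # (1 - depth_edges) weight relaxes the penalty near true depth edges.
    return (pred_edges * (1.0 - depth_edges)).mean()
```

In training, a term like this would typically be added to the usual supervised loss with a small weight, e.g. loss = ce + 0.1 * depth_aware_edge_consistency_loss(logits, depth), where the 0.1 weight is again an arbitrary placeholder rather than a value from the paper.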
