
INTERACTIVE CHANGE DETECTION USING HIGH RESOLUTION REMOTE SENSING IMAGES BASED ON ACTIVE LEARNING WITH GAUSSIAN PROCESSES
Author(s) -
Hui Ren,
Huai Yu,
Pingping Huang,
Wen Ye
Publication year - 2016
Publication title -
ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.356
H-Index - 38
eISSN - 2194-9042
pISSN - 2196-6346
DOI - 10.5194/isprsannals-iii-7-141-2016
Subject(s) - change detection, computer science, remote sensing, land cover, artificial intelligence, Gaussian process, pattern recognition, computer vision, land use, geography
Although change detection has been widely studied, the effective and efficient use of high resolution remote sensing images remains a problem. Conventional supervised methods require large numbers of annotations to classify land cover categories and detect their changes. Moreover, the training sets used by supervised methods often contain many redundant samples that carry no essential information. In this study, we present a method for interactive change detection in high resolution remote sensing images based on active learning, which overcomes these shortcomings of existing change detection techniques. Our method requires no annotations of actual land cover categories at the start. First, a certain number of the most representative objects are selected in an unsupervised way. Then, change areas are detected from multi-temporal high resolution remote sensing images by active learning with Gaussian processes, iterating interactively until the detection results no longer change notably. Manual labelling is reduced substantially, and a desirable detection result is obtained within a few iterations. Experiments on GeoEye-1 and WorldView-2 remote sensing images demonstrate the effectiveness and efficiency of the proposed method.
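The interactive loop described in the abstract can be sketched as pool-based active learning with a Gaussian process classifier. The sketch below uses scikit-learn's `GaussianProcessClassifier` with uncertainty sampling; the synthetic object features, the seed-set choice, and the query criterion are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch: pool-based active learning with a Gaussian process
# classifier, approximating the interactive change-detection loop.
# The features, labels, and uncertainty criterion are illustrative
# assumptions, not the paper's exact method.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Synthetic "object" features: two clusters standing in for
# unchanged (class 0) and changed (class 1) image objects.
X_pool = np.vstack([rng.normal(0.0, 0.5, (100, 2)),
                    rng.normal(2.0, 0.5, (100, 2))])
y_pool = np.array([0] * 100 + [1] * 100)  # oracle labels (the analyst)

# Seed set: a few representative objects (the paper selects these in
# an unsupervised way; here we simply fix indices from each cluster).
labeled = [0, 1, 100, 101]

for it in range(5):  # interactive iterations
    gpc = GaussianProcessClassifier(kernel=RBF(1.0), random_state=0)
    gpc.fit(X_pool[labeled], y_pool[labeled])

    # Uncertainty sampling: query the object whose predicted change
    # probability is closest to 0.5 (most ambiguous to the model).
    proba = gpc.predict_proba(X_pool)[:, 1]
    unlabeled = [i for i in range(len(X_pool)) if i not in labeled]
    query = min(unlabeled, key=lambda i: abs(proba[i] - 0.5))
    labeled.append(query)  # the analyst labels the queried object

accuracy = (gpc.predict(X_pool) == y_pool).mean()
print(f"labeled {len(labeled)} of {len(X_pool)} objects, "
      f"accuracy {accuracy:.2f}")
```

In this toy setting the classifier separates the two clusters after labelling only a handful of objects, which mirrors the abstract's claim that manual labelling can be reduced substantially.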