Open Access
A Fast Algorithm to Estimate the Deepest Points of Lakes for Regional Lake Registration
Author(s) -
Zhanfeng Shen,
Xinju Yu,
Yongwei Sheng,
Junli Li,
Jiancheng Luo
Publication year - 2015
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0144700
Subject(s) - polygon (computer graphics) , computer science , algorithm , process (computing) , point (geometry) , voronoi diagram , matching (statistics) , geospatial analysis , support vector machine , geology , artificial intelligence , remote sensing , mathematics , geometry , statistics , telecommunications , frame (networking) , operating system
When conducting image registration in the U.S. state of Alaska, it is very difficult to locate satisfactory ground control points (GCPs) because ice, snow, and lakes cover much of the ground. However, GCPs can be located by seeking stable points in extracted lake data. This paper defines a process to estimate the deepest points of lakes and use them as stable ground control points for registration. We estimate the deepest point of a lake by computing the center of the largest inner circle (LIC) of the polygon representing the lake. An LIC-seeking method based on Voronoi diagrams is proposed, and an algorithm based on medial axis simplification (MAS) is introduced. The proposed design also incorporates parallel data computing. The key issue of selecting a partitioning policy for vector data is studied in detail, and the selected policy, which equalizes algorithm complexity across partitions, is shown to be the optimal policy for parallel vector processing. Through several experimental applications, we conclude that the presented approach accurately estimates the deepest points of Alaskan lakes; furthermore, MAS combined with the complexity-equalization partitioning policy achieves high efficiency.
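The largest-inner-circle idea can be illustrated with a short sketch. The Python code below is a minimal approximation for illustration only, not the authors' MAS-based parallel implementation: it samples a lake polygon's boundary, builds a Voronoi diagram of the sampled points (whose interior vertices approximate the medial axis), and returns the interior vertex farthest from the boundary as the estimated deepest point. The shapely/scipy dependencies, the sampling step, and the toy polygon are assumptions for this example.

```python
# Illustrative sketch of a Voronoi-based largest-inner-circle (LIC) estimate.
# Assumptions: shapely and scipy are available; coordinates are projected
# (e.g. metres); this is NOT the paper's MAS-accelerated parallel algorithm.
import numpy as np
from scipy.spatial import Voronoi
from shapely.geometry import Point, Polygon


def estimate_deepest_point(lake: Polygon, sample_step: float = 10.0):
    """Approximate the centre and radius of the largest inner circle of a lake polygon."""
    # Densely sample the polygon boundary.
    boundary = lake.exterior
    distances = np.arange(0.0, boundary.length, sample_step)
    samples = np.array([boundary.interpolate(d).coords[0] for d in distances])

    # Voronoi vertices of the boundary samples approximate the medial axis.
    vor = Voronoi(samples)

    best_center, best_radius = None, -1.0
    for vx, vy in vor.vertices:
        p = Point(vx, vy)
        if not lake.contains(p):      # discard vertices outside the lake
            continue
        r = p.distance(boundary)      # inscribed-circle radius at this vertex
        if r > best_radius:
            best_center, best_radius = p, r
    return best_center, best_radius


if __name__ == "__main__":
    # Toy lake polygon used purely to exercise the sketch.
    lake = Polygon([(0, 0), (100, 0), (120, 60), (60, 110), (-10, 70)])
    center, radius = estimate_deepest_point(lake, sample_step=2.0)
    print(f"estimated deepest point: ({center.x:.1f}, {center.y:.1f}), "
          f"inscribed radius: {radius:.1f}")
```

A finer sampling step tightens the medial-axis approximation at the cost of a larger Voronoi diagram, which is the trade-off that a simplification step such as MAS is meant to address.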
