A deep learning approach for rooftop geocoding
Author(s) -
Yin Zhengcong,
Ma Andong,
Goldberg Daniel W.
Publication year - 2019
Publication title -
Transactions in GIS
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.721
H-Index - 63
eISSN - 1467-9671
pISSN - 1361-1682
DOI - 10.1111/tgis.12536
Subject(s) - geocoding , computer science , interpolation (computer graphics) , object (grammar) , set (abstract data type) , data mining , workflow , geographic information system , spatial analysis , artificial intelligence , geography , cartography , remote sensing , database , motion (physics) , programming language
Geocoding has become a routine task for many research investigations that conduct spatial analysis. However, the output quality of geocoding systems is known to affect the conclusions of subsequent studies that rely on this workflow. Published developments of geocoding systems have long been limited to the same set of interpolation methods and reference data sets. We introduce a novel geocoding approach that applies object detection to remotely sensed imagery, based on a deep learning framework, to generate rooftop geocoding output. This allows geocoding systems to use and output exact building locations without employing typical geocoding interpolation methods or being completely limited by the availability of reference data sets. The utility of the proposed approach is demonstrated on a sample of 22,481 addresses, yielding a significant reduction in spatial error and match rates comparable to those of typical geocoding methods. Across land-use types, the approach performs better on low-density residential and commercial addresses than on high-density residential addresses. With appropriate model setup and training, the proposed approach can be extended to search for different object locations and to generate new address and point-of-interest reference data sets.
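The abstract does not specify the model, imagery source, or pipeline details, so the following is only a minimal illustrative sketch of the general idea: take a conventionally geocoded (interpolated) location, run a building detector over the surrounding image tile, and snap the output to the centroid of the nearest detected rooftop. The detector used here (a generic torchvision Faster R-CNN with default weights) is a stand-in and is not trained for rooftop detection; the function name, pixel-coordinate inputs, and score threshold are assumptions for illustration.

```python
# Illustrative sketch only -- not the authors' implementation.
# Refines an interpolated geocode (given in the tile's pixel coordinates)
# to the centroid of the nearest detected building-like object.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image


def detect_rooftop_centroid(image_path, interpolated_px, score_threshold=0.5):
    """Return the pixel centroid of the detection nearest to the
    interpolated geocode location, or None if nothing clears the threshold."""
    # Stand-in detector: a COCO-pretrained Faster R-CNN. A real rooftop
    # geocoder would use a model trained on labeled building footprints.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        detections = model([image])[0]  # dict with 'boxes', 'labels', 'scores'

    best, best_dist = None, float("inf")
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score < score_threshold:
            continue
        x_min, y_min, x_max, y_max = box.tolist()
        cx, cy = (x_min + x_max) / 2.0, (y_min + y_max) / 2.0
        dist = ((cx - interpolated_px[0]) ** 2 + (cy - interpolated_px[1]) ** 2) ** 0.5
        if dist < best_dist:
            best, best_dist = (cx, cy), dist
    return best
```

In a full pipeline, the returned pixel centroid would be converted back to geographic coordinates using the image tile's georeferencing, giving a rooftop-level geocode in place of the interpolated one.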
