
Mapping of land cover with open‐source software and ultra‐high‐resolution imagery acquired with unmanned aerial vehicles
Author(s) -
Horning Ned,
Fleishman Erica,
Ersts Peter J.,
Fogarty Frank A.,
Zillig Martha Wohlfeil
Publication year - 2020
Publication title -
Remote Sensing in Ecology and Conservation
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.191
H-Index - 21
ISSN - 2056-3485
DOI - 10.1002/rse2.144
Subject(s) - land cover, remote sensing, workflow, computer science, orthophoto, artificial intelligence, software, convolutional neural network, cartography, computer vision, land use, geography, database, engineering, civil engineering, programming language
The use of unmanned aerial vehicles (UAVs) to map and monitor the environment has increased sharply in the last few years. Many individuals and organizations have purchased consumer‐grade UAVs and commonly acquire aerial photographs to map land cover. The resulting ultra‐high‐resolution (sub‐decimeter‐resolution) imagery has high information content, but automating the extraction of this information to create accurate, wall‐to‐wall land‐cover maps is quite difficult. We introduce image‐processing workflows that are based on open‐source software and can be used to create land‐cover maps from ultra‐high‐resolution aerial imagery. We compared four machine‐learning workflows for classifying images. Two workflows were based on random forest algorithms: one used a pixel‐by‐pixel approach available in ilastik, and the other used image segments and was implemented with R and the Orfeo ToolBox. The other two workflows used fully connected neural networks and convolutional neural networks implemented with Nenetic. We applied the four workflows to aerial photographs acquired in the Great Basin (western USA) at flying heights of 10 m, 45 m and 90 m above ground level. Our focal cover type was cheatgrass (Bromus tectorum), a non‐native invasive grass that changes regional fire dynamics. The most accurate workflow for classifying ultra‐high‐resolution imagery depends on factors influenced by image resolution and land‐cover characteristics, such as contrast, landscape pattern and the spectral texture of the cover types being classified. For our application, the ilastik workflow yielded the highest overall accuracy (0.82–0.89), as assessed at the pixel level.
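To illustrate the general approach of the pixel‐by‐pixel random forest workflow, the sketch below shows a minimal pixel‐based random forest classification of an orthomosaic. It is an assumption‐laden example, not the published workflow: the study used ilastik, R with the Orfeo ToolBox, and Nenetic, whereas this sketch uses Python with rasterio and scikit‐learn, and the file names and label raster are hypothetical.

```python
# Minimal sketch: pixel-based random forest land-cover classification.
# Hypothetical inputs: a UAV orthomosaic and a raster of labeled training
# pixels (0 = unlabeled, >0 = class id). Not the authors' implementation.
import numpy as np
import rasterio
from sklearn.ensemble import RandomForestClassifier

with rasterio.open("orthomosaic.tif") as src:        # hypothetical orthomosaic path
    image = src.read()                               # shape: (bands, rows, cols)
    profile = src.profile

with rasterio.open("training_labels.tif") as src:    # hypothetical training-label raster
    labels = src.read(1)

bands, rows, cols = image.shape
pixels = image.reshape(bands, -1).T                  # (n_pixels, n_bands)
flat_labels = labels.ravel()

# Fit the classifier on labeled pixels only, then predict every pixel.
train_mask = flat_labels > 0
clf = RandomForestClassifier(n_estimators=200, n_jobs=-1)
clf.fit(pixels[train_mask], flat_labels[train_mask])
predicted = clf.predict(pixels).reshape(rows, cols).astype("uint8")

# Write the wall-to-wall land-cover map with the orthomosaic's georeferencing.
profile.update(count=1, dtype="uint8")
with rasterio.open("landcover_map.tif", "w", **profile) as dst:
    dst.write(predicted, 1)
```

A segment‐based variant, analogous to the R and Orfeo ToolBox workflow, would first group pixels into image segments and classify per‐segment statistics (for example, mean band values) rather than individual pixels.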