Open Access
Qualification of Soybean Responses to Flooding Stress Using UAV-Based Imagery and Deep Learning
Author(s) -
Jing Zhou,
Huawei Mou,
Jianfeng Zhou,
Md Liakat Ali,
Heng Ye,
Pengyin Chen,
Henry T. Nguyen
Publication year - 2021
Publication title - Plant Phenomics
Language(s) - English
Resource type - Journals
eISSN - 2097-0374
pISSN - 2643-6515
DOI - 10.34133/2021/9892570
Subject(s) - multispectral image, canopy, flooding, environmental science, cultivar, abiotic component, remote sensing, agronomy, biology, botany, ecology
Soybean is sensitive to flooding stress, which can result in poor seed quality and significant yield reduction. Soybean production under flooding could be sustained by developing flood-tolerant cultivars through breeding programs. Conventionally, soybean tolerance to flooding in field conditions is evaluated by visually rating shoot injury and damage due to flooding stress, which is labor-intensive and subject to human error. Recent developments in field high-throughput phenotyping technology have shown great potential for measuring crop traits and detecting crop responses to abiotic and biotic stresses. The goal of this study was to investigate the potential of estimating flood-induced soybean injuries using UAV-based image features collected at different flight heights. The flooding injury score (FIS) of 724 soybean breeding plots was rated visually by breeders when the soybean plants showed obvious injury symptoms. Aerial images were taken on the same day using a five-band multispectral camera and an infrared (IR) thermal camera at 20, 50, and 80 m above ground. Five image features, i.e., canopy temperature, normalized difference vegetation index (NDVI), canopy area, width, and length, were extracted from the images at the three flight heights. A deep learning model was used to classify the soybean breeding plots into the five FIS ratings based on the extracted image features. Results show that the image features differed significantly among the three flight heights. The best classification performance, a score of 0.9 for the five-level FIS, was obtained by the model developed using image features at 20 m. The results indicate that the proposed method is promising for estimating FIS in soybean breeding.
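As a minimal illustration of the kind of plot-level features the abstract describes, the sketch below computes a per-pixel NDVI from NIR and red reflectance bands and derives canopy area, width, and length from a binary canopy mask. This is an assumption-laden sketch, not the authors' pipeline: the band arrays, the `canopy_mask` input, and the bounding-box definition of width/length are all hypothetical, and the paper's segmentation step and deep learning classifier are not reproduced.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)  # eps guards against divide-by-zero

def canopy_features(canopy_mask):
    """Plot-level geometry from a binary canopy mask (hypothetical definitions):
    area = pixel count, width/length = bounding-box extents in pixels."""
    rows = np.any(canopy_mask, axis=1)
    cols = np.any(canopy_mask, axis=0)
    if not rows.any():
        return 0, 0, 0  # empty mask: no canopy detected
    length = int(rows.nonzero()[0].ptp()) + 1  # extent along image rows
    width = int(cols.nonzero()[0].ptp()) + 1   # extent along image columns
    return int(canopy_mask.sum()), width, length

# Toy example: a 5x5 plot with a 3x2 block of canopy pixels
nir = np.full((5, 5), 0.8)
red = np.full((5, 5), 0.2)
mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 2:4] = True
area, width, length = canopy_features(mask)
```

Features like these (together with canopy temperature from the thermal band) could then be stacked into a feature vector per plot and fed to any five-class classifier.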
