Open Access
Mask R‐CNN‐based feature extraction and three‐dimensional recognition of rice panicle CT images
Author(s) - Kong Huihua, Chen Ping
Publication year - 2021
Publication title - Plant Direct
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.211
H-Index - 11
ISSN - 2475-4455
DOI - 10.1002/pld3.323
Subject(s) - panicle , artificial intelligence , pattern recognition (psychology) , feature (linguistics) , feature extraction , computer science , convolutional neural network , mathematics , euclidean distance , horticulture , biology , philosophy , linguistics
The rice panicle seed setting rate is extremely important for calculating rice yield and performing genetic analysis. Unlike machine vision, X‐ray computed tomography (CT) imaging is a nondestructive technique that provides direct information on the internal and external structure of rice panicles. However, occlusion and adhesion of panicles and grains in a CT image sequence make these objects difficult to identify, which in turn hinders accurate determination of the seed setting rate of rice panicles. Therefore, this paper proposes a method based on a mask region convolutional neural network (Mask R‐CNN) for feature extraction and three‐dimensional (3‐D) recognition of CT images of rice panicles. X‐ray CT feature characterization was combined with the Mask R‐CNN algorithm to perform feature extraction and classification of a panicle and grains in each layer of the CT sequence. The Euclidean distance between adjacent layers was minimized to extract the features of a 3‐D panicle and grains. The results were used to calculate the rice panicle seed setting rate. The proposed method was experimentally verified using eight sets of different rice panicles. The results showed that the proposed method can efficiently identify and count plump grains and blighted grains to achieve an accuracy above 99% for the seed setting rate.
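The abstract describes two computable steps: linking per-slice grain detections into 3-D grains by minimizing the Euclidean distance between adjacent CT layers, and computing the seed setting rate from the counts of plump and blighted grains. The sketch below illustrates both steps under stated assumptions; the function names, the greedy nearest-centroid matching, and the distance threshold are illustrative choices, not the authors' implementation.

```python
# Hypothetical sketch: associate grain detections across adjacent CT slices by
# minimizing the Euclidean distance between mask centroids, then compute the
# seed setting rate. Thresholds and matching strategy are assumptions.
import numpy as np

def match_adjacent_layers(centroids_a, centroids_b, max_dist=5.0):
    """Greedily pair each grain centroid in slice A with the nearest unused
    centroid in slice B, accepting the pair only if the Euclidean distance
    is below max_dist (pixels)."""
    pairs, used_b = [], set()
    for i, ca in enumerate(centroids_a):
        dists = [np.linalg.norm(np.asarray(ca) - np.asarray(cb)) for cb in centroids_b]
        for j in np.argsort(dists):
            j = int(j)
            if j not in used_b and dists[j] <= max_dist:
                pairs.append((i, j))
                used_b.add(j)
                break
    return pairs

def seed_setting_rate(n_plump, n_blighted):
    """Seed setting rate = plump grains / (plump + blighted) grains."""
    total = n_plump + n_blighted
    return n_plump / total if total else 0.0

# Example: grain centroids (pixel coordinates) in two consecutive CT slices.
layer1 = [(10.0, 12.0), (40.5, 33.0)]
layer2 = [(10.8, 12.4), (41.0, 32.2), (80.0, 5.0)]
print(match_adjacent_layers(layer1, layer2))          # [(0, 0), (1, 1)]
print(seed_setting_rate(n_plump=118, n_blighted=2))   # 0.9833...
```

In a full pipeline, the centroids would come from the Mask R-CNN instance masks of each CT layer, and chains of matched centroids across the stack would define individual 3-D grains before counting.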
