Research Library

Open Access
Research on leaf image identification based on improved AlexNet neural network
Author(s): Wenkun Zhang, Wen Juanjuan
Publication year: 2021
Publication title: Journal of Physics: Conference Series
Resource type: Journals
Publisher: IOP Publishing
The results of plant leaf classification can be applied in many fields, such as plant protection, and they also play a role in spreading knowledge of plant diversity. At present, recognizing plants by their leaves has become a popular research topic in many fields. This paper focuses on which network structure to choose for identifying plant leaves. The convolutional neural network is a popular network structure in recent years: features are extracted by convolution layers, feature dimensionality reduction is performed by pooling layers, a column vector is output by the fully connected layer, and leaf recognition and classification are completed by a classifier. We therefore construct an 11-layer convolutional neural network for five-class classification of plant leaves, modify different parameters of the network structure, and try various optimizers; the average accuracy on the final test set exceeds 99%. The experimental results show that the model achieves a high recognition rate and high accuracy on the leaves tested, which gives it certain practical significance.
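The pipeline described in the abstract (convolution layers for feature extraction, pooling for dimensionality reduction, fully connected layers producing a column vector, and a classifier over five classes) can be sketched as an AlexNet-style network in PyTorch. Note that the paper's exact 11-layer configuration, filter sizes, and input resolution are not given in this record, so the layer counts and shapes below (5 conv + 3 pooling + 3 fully connected, 224x224 RGB input) are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn


class LeafNet(nn.Module):
    """Hypothetical AlexNet-style CNN for 5-class leaf classification.

    Counting the 5 convolution, 3 pooling, and 3 fully connected layers
    gives 11 layers; the paper's actual configuration may differ.
    """

    def __init__(self, num_classes: int = 5):
        super().__init__()
        # Feature extractor: convolutions extract features,
        # max-pooling reduces spatial dimensionality.
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(64, 192, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(192, 384, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        # Classifier head: flatten to a column vector, then fully
        # connected layers ending in one logit per leaf class.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256 * 6 * 6, 4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, 4096),
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


model = LeafNet()
# One dummy 224x224 RGB leaf image; output is one score per class.
out = model(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 5])
```

In practice the logits would be passed through a softmax (implicitly, via `nn.CrossEntropyLoss` during training) to obtain class probabilities over the five leaf categories.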
Subject(s): artificial intelligence, artificial neural network, chemistry, classifier (UML), computer science, construct (Python library), convolution (computer science), convolutional neural network, layer (electronics), organic chemistry, pattern recognition (psychology), plant identification, pooling, programming language
Language(s): English
SCImago Journal Rank: 0.21
H-Index: 85
eISSN: 1742-6596
pISSN: 1742-6588
DOI: 10.1088/1742-6596/2031/1/012014
