Intelligent microscopic approach for identification and recognition of citrus deformities
Author(s) -
Safdar Arooj,
Khan Muhammad A.,
Shah Jamal H.,
Sharif Muhammad,
Saba Tanzila,
Rehman Amjad,
Javed Kashif,
Khan Junaid A.
Publication year - 2019
Publication title -
Microscopy Research and Technique
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.536
H-Index - 118
eISSN - 1097-0029
pISSN - 1059-910X
DOI - 10.1002/jemt.23320
Subject(s) - preprocessor, computer science, artificial intelligence, principal component analysis, pattern recognition (psychology), dimensionality reduction, identification (biology), transformation (genetics), support vector machine, feature extraction, segmentation, computer vision, biochemistry, chemistry, botany, biology, gene
Plant diseases are responsible for substantial economic losses in agricultural countries. Manual diagnosis of plant diseases has been a key challenge over the last decade; therefore, researchers in this area have introduced automated systems. In this work, an automated system is proposed for citrus fruit disease recognition using computer vision techniques. The proposed method comprises five fundamental steps: preprocessing, disease segmentation, feature extraction and reduction, fusion, and classification. In the first phase, noise is removed and a contrast stretching procedure is applied. Next, the watershed method is applied to extract the infected regions. Shape, texture, and color features are then computed from these regions. In the fourth step, the reduced features are fused using a serial‐based approach, followed by a final classification step using a multiclass support vector machine. For dimensionality reduction, principal component analysis is utilized, a statistical procedure that applies an orthogonal transformation to a set of observations. Three image data sets (Citrus Image Gallery, Plant Village, and a self‐collected set) are combined in this work, achieving a classification accuracy of 95.5%. These results show that the proposed method outperforms several existing methods in precision and accuracy.
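The later stages of the pipeline described in the abstract (serial feature fusion, PCA-based reduction, and multiclass SVM classification) can be sketched as below. This is a minimal illustration, not the authors' implementation: the feature dimensions, the four-class label set, and the random vectors standing in for the shape, texture, and color descriptors of the watershed-segmented lesion regions are all assumptions made for demonstration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Assume 120 segmented lesion images, each described by three feature
# vectors (shape, texture, color); dimensions below are hypothetical.
shape_feats = rng.normal(size=(120, 10))
texture_feats = rng.normal(size=(120, 59))
color_feats = rng.normal(size=(120, 48))
labels = rng.integers(0, 4, size=120)  # four hypothetical disease classes

# Serial fusion: concatenate the per-modality vectors along the feature axis.
fused = np.concatenate([shape_feats, texture_feats, color_feats], axis=1)

# PCA applies an orthogonal transformation; keep the leading components.
reduced = PCA(n_components=20).fit_transform(fused)

# Multiclass SVM (one-vs-one) on the reduced, fused features.
clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(reduced, labels)
preds = clf.predict(reduced)
```

In a real system, the random arrays would be replaced by descriptors computed from the contrast-stretched, watershed-segmented images; the serial fusion step itself is just a concatenation, so the choice of descriptors and of `n_components` carries most of the design weight.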