Open Access
Rice Disease Classification using Deep Convolutional Neural Network
Author(s) -
Tanya Shrivastava,
M. Radhakrishna Pillai,
B. Baranidharan
Publication year - 2020
Publication title -
international journal of innovative technology and exploring engineering
Language(s) - English
Resource type - Journals
ISSN - 2278-3075
DOI - 10.35940/ijitee.d1112.029420
Subject(s) - convolutional neural network, artificial intelligence, computer science, deep learning, pattern recognition (psychology), machine learning, food security, contextual image classification, residual neural network, population, agriculture, image (mathematics), geography, demography, archaeology, sociology
For an agro-based country like India, where agriculture is the main source of livelihood for more than 50% of the population, crop diseases are a major threat to food security. Digital image processing, combined with suitable machine learning algorithms, can therefore be used to classify diseases from images of a plant. In this paper, a comparative study of the effects of different machine learning models on crop disease prediction has been carried out. Since Convolutional Neural Networks (CNNs) have proven to be the most effective approach for image classification, only CNN-based models were considered in this study. We compared the performance of SmallCNN with three pre-trained CNN models, namely AlexNet, ResNet-50, and VGG-16. SmallCNN is a CNN model built by us with fewer parameters, suitable for small datasets. The crop tested in this research is Oryza sativa (Asian rice), commonly referred to as paddy, which is cultivated in abundance in India. The input dataset was fed into the models after appropriate pre-processing followed by segmentation. The best accuracy of 66.67% was achieved by ResNet-50 with Adam as the optimizer at a learning rate of 0.0001.
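As a rough illustration of the best-performing configuration reported above (ResNet-50 fine-tuned with the Adam optimizer at a learning rate of 0.0001), the following PyTorch sketch shows how such a model could be set up. The dataset path, number of disease classes, image size, batch size, and epoch count are assumptions for illustration only; this is not the authors' code.

```python
# Minimal sketch: fine-tuning a pre-trained ResNet-50 for rice disease
# classification with Adam at lr = 0.0001 (the configuration reported
# in the abstract). Dataset layout and hyperparameters other than the
# learning rate are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 3                       # assumption: e.g. three common paddy diseases
DATA_DIR = "rice_leaf_images/train"   # hypothetical folder-per-class dataset

# Standard preprocessing: resize to 224x224 and normalize with ImageNet statistics
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder(DATA_DIR, transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Pre-trained ResNet-50 with its final layer replaced for the rice disease classes
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # learning rate from the abstract

model.train()
for epoch in range(10):  # epoch count is an assumption
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: loss {loss.item():.4f}")
```

The same loop could be repeated with AlexNet or VGG-16 from torchvision (swapping the final classifier layer accordingly) to reproduce the kind of comparison described in the study.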