
Plant Disease Classification using Lite Pretrained Deep Convolutional Neural Network on Android Mobile Device
Author(s) -
Burhanudin Syamsuri,
Gede Putra Kusuma*
Publication year - 2019
Publication title -
International Journal of Innovative Technology and Exploring Engineering
Language(s) - English
Resource type - Journals
ISSN - 2278-3075
DOI - 10.35940/ijitee.b6647.129219
Subject(s) - convolutional neural network , computer science , android (operating system) , mobile device , deep learning , artificial intelligence , latency (audio) , mobile computing , pattern recognition (psychology) , real time computing , operating system , telecommunications
The implementation of image recognition in agriculture to detect symptoms of plant disease using deep learning Convolutional Neural Network (CNN) models has proven to be highly effective. The computational efficiency of CNNs makes it possible to run such applications on mobile devices. This paper proposes optimizing the utilization of mobile devices by selecting the CNN model that achieves the highest accuracy with the lowest resource consumption when run as a detection system on a mobile device. In this study, the PlantVillage dataset, extended with coffee leaf images, was tested and compared using three CNN models: two models specifically designed for mobile, MobileNet and Mobile NasNet (MNasNet), and one model recognized for its accuracy on personal computers (PC), InceptionV3. The experiments, executed on both mobile and PC, found a slight degradation in accuracy when the application runs on mobile. InceptionV3 proved to be the most persistent model compared to MNasNet and MobileNet, yet it also had the largest latency. The final results on the mobile device recorded InceptionV3 achieving the highest accuracy of 95.79%, followed by MNasNet at 94.87% and MobileNet at 92.83%, while for latency MobileNet achieved the lowest at 394.70 ms, followed by MNasNet at 430.20 ms and InceptionV3 at 2236.10 ms. It is expected that the outcome of this study will be of great benefit to farmers, as mobile image recognition would help them analyze the condition of their plants on site simply by taking a picture of a leaf and running the classification on their mobile device.
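To illustrate the kind of on-device pipeline the abstract describes, the sketch below shows how a lite pretrained CNN could be run on Android to classify a leaf photo and measure inference latency. It is a minimal sketch, assuming the chosen model (e.g., MobileNet, MNasNet, or InceptionV3) has been converted to a TensorFlow Lite file bundled in the app's assets; the file name, input size, class count, and pixel normalization are assumptions, not details taken from the paper.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.channels.FileChannel

// Hypothetical helper: loads an assumed .tflite model from app assets and
// classifies a single leaf image, timing the inference call.
class LeafClassifier(
    context: Context,
    modelPath: String = "plant_disease_mobilenet.tflite"  // assumed asset name
) {
    private val interpreter: Interpreter

    init {
        // Memory-map the model file from assets into a ByteBuffer for the interpreter.
        val fd = context.assets.openFd(modelPath)
        val channel = FileInputStream(fd.fileDescriptor).channel
        val model = channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
        interpreter = Interpreter(model)
    }

    // Returns (predicted class index, latency in ms).
    // inputSize and numClasses are assumptions; adjust to the converted model.
    fun classify(leaf: Bitmap, inputSize: Int = 224, numClasses: Int = 39): Pair<Int, Long> {
        val scaled = Bitmap.createScaledBitmap(leaf, inputSize, inputSize, true)

        // Pack RGB pixels into a float buffer; [0, 1] normalization is an assumption.
        val input = ByteBuffer.allocateDirect(4 * inputSize * inputSize * 3)
            .order(ByteOrder.nativeOrder())
        val pixels = IntArray(inputSize * inputSize)
        scaled.getPixels(pixels, 0, inputSize, 0, 0, inputSize, inputSize)
        for (p in pixels) {
            input.putFloat(((p shr 16) and 0xFF) / 255f)
            input.putFloat(((p shr 8) and 0xFF) / 255f)
            input.putFloat((p and 0xFF) / 255f)
        }

        // Run inference and measure wall-clock latency around the call.
        val output = Array(1) { FloatArray(numClasses) }
        val start = System.currentTimeMillis()
        interpreter.run(input, output)
        val latencyMs = System.currentTimeMillis() - start

        // Pick the class with the highest score.
        val best = output[0].indices.maxByOrNull { output[0][it] } ?: -1
        return best to latencyMs
    }
}
```

A usage pattern consistent with the study's scenario would be to capture a leaf photo with the device camera, pass the resulting Bitmap to `classify`, and display the predicted disease label together with the measured latency.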