Open Access
Road Pothole Detection using Deep Learning Classifiers
Author(s) - Surekha Arjapure, D. R. Kalbande
Publication year - 2020
Publication title - International Journal of Recent Technology and Engineering
Language(s) - English
Resource type - Journals
ISSN - 2277-3878
DOI - 10.35940/ijrte.f7349.038620
Subject(s) - pothole (geology), convolutional neural network, python (programming language), computer science, artificial intelligence, deep learning, artificial neural network, process (computing), machine learning, pattern recognition (psychology), data mining, petrology, geology, operating system
Potholes are among the most common road surface defects, and their assessment is a necessary part of road maintenance. They are a major cause of road accidents and contribute to vehicle wear and tear. Road defect assessment involves collecting defect data and then processing that data. With modern imaging systems, data collection has become largely automated, but assessment of defects from the collected data remains manual. Manual classification and evaluation of potholes is expensive, labour-intensive, and time-consuming, and thus slows down the overall road maintenance process. This paper describes a method for detecting and classifying potholes in road images using convolutional neural networks (CNNs), a class of deep learning algorithms. The proposed system uses a CNN-based approach with pre-trained models to classify input images into pothole and non-pothole categories. The method was implemented in Python using the OpenCV library under Windows and Google Colab environments, trained on 722 raw images and tested on 116. Results are evaluated and compared for a baseline CNN and seven pre-trained models using accuracy, precision, and recall metrics. The results show that the pre-trained models InceptionResNetV2 and DenseNet201 can detect potholes in road images with a reasonably good accuracy of 89.66%.
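
The record does not include the authors' implementation, but a minimal transfer-learning sketch of the approach the abstract describes might look like the following. It assumes a Keras/TensorFlow pipeline, a DenseNet201 backbone (one of the two best-performing models reported), 224x224 inputs, and a hypothetical data/train and data/test directory layout with pothole/ and non_pothole/ subfolders; all hyperparameters are illustrative, not the paper's.

import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications.densenet import DenseNet201, preprocess_input

# Load road images from hypothetical data/train and data/test directories,
# each containing pothole/ and non_pothole/ subfolders.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(224, 224), batch_size=32, label_mode="binary")
test_ds = tf.keras.utils.image_dataset_from_directory(
    "data/test", image_size=(224, 224), batch_size=32, label_mode="binary")

# Apply the backbone's own ImageNet preprocessing to each batch.
train_ds = train_ds.map(lambda x, y: (preprocess_input(x), y))
test_ds = test_ds.map(lambda x, y: (preprocess_input(x), y))

# Pre-trained DenseNet201 backbone with its classification head removed.
base = DenseNet201(weights="imagenet", include_top=False,
                   input_shape=(224, 224, 3), pooling="avg")
base.trainable = False  # freeze ImageNet weights; train only the new head

# Binary head: a single sigmoid unit for pothole vs. non-pothole.
model = models.Sequential([
    base,
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])

model.fit(train_ds, epochs=10)
loss, acc, prec, rec = model.evaluate(test_ds)
print(f"accuracy={acc:.4f} precision={prec:.4f} recall={rec:.4f}")

Swapping DenseNet201 for another Keras application (e.g. InceptionResNetV2, with its matching preprocess_input) would reproduce the kind of model-by-model comparison the abstract reports.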