Glaucoma assessment from color fundus images using convolutional neural network
Author(s) - Elangovan Poonguzhali, Nath Malaya Kumar
Publication year - 2021
Publication title - International Journal of Imaging Systems and Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.359
H-Index - 47
eISSN - 1098-1098
pISSN - 0899-9457
DOI - 10.1002/ima.22494
Subject(s) - convolutional neural network , artificial intelligence , computer science , fundus (eye) , pattern recognition (psychology) , discriminative model , deep learning , glaucoma , computer vision , ophthalmology , medicine
Abstract Early detection and proper screening are essential to prevent vision loss due to glaucoma. In recent years, convolutional neural networks (CNNs) have been successfully applied to color fundus images for the automatic detection of glaucoma. Compared with existing automatic screening methods, CNNs can extract discriminative features directly from the fundus images. In this paper, a deep learning architecture based on a CNN is designed for the classification of glaucomatous and normal fundus images. An 18-layer CNN is designed and trained to extract discriminative features from the fundus image; it comprises four convolutional layers, two max-pooling layers, and one fully connected layer. A two-stage tuning approach is proposed for selecting a suitable batch size and initial learning rate. The proposed network is tested on the DRISHTI-GS1, ORIGA, RIM-ONE2 (release 2), ACRIMA, and large-scale attention-based glaucoma (LAG) databases. A rotation-based data augmentation technique is employed to enlarge the datasets. A randomly selected 70% of the images is used for training the model and the remaining 30% for testing. Overall accuracies of 86.62%, 85.97%, 78.32%, 94.43%, and 96.64% are obtained on the DRISHTI-GS1, RIM-ONE2, ORIGA, LAG, and ACRIMA databases, respectively. For the ACRIMA database, the proposed method achieves an accuracy, sensitivity, specificity, and precision of 96.64%, 96.07%, 97.39%, and 97.74%, respectively. Compared with other existing architectures, the proposed method is robust to Gaussian noise and salt-and-pepper noise.
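The data-preparation steps described in the abstract, rotation-based augmentation followed by a random 70/30 train/test split, can be sketched as below. This is a minimal illustration, not the authors' code: the abstract does not state the rotation angles used, so 90-degree multiples via `np.rot90` are assumed here, and the image sizes and helper names are hypothetical.

```python
import numpy as np

def augment_rotations(images, labels):
    """Enlarge the dataset with rotated copies of each image.
    The paper's exact rotation angles are not given in the abstract;
    90/180/270-degree rotations are used here as an illustration."""
    aug_images, aug_labels = [], []
    for img, lab in zip(images, labels):
        for k in range(4):  # k=0 keeps the original image
            aug_images.append(np.rot90(img, k))
            aug_labels.append(lab)
    return np.stack(aug_images), np.array(aug_labels)

def train_test_split_70_30(images, labels, seed=0):
    """Randomly select 70% of the images for training and the
    remaining 30% for testing, as described in the abstract."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(images))
    cut = int(0.7 * len(images))
    tr, te = idx[:cut], idx[cut:]
    return images[tr], labels[tr], images[te], labels[te]

# Toy stand-in for color fundus images: 10 RGB images of size 64x64.
imgs = np.zeros((10, 64, 64, 3), dtype=np.float32)
labs = np.array([0, 1] * 5)  # 0 = normal, 1 = glaucomatous

aug_imgs, aug_labs = augment_rotations(imgs, labs)
Xtr, ytr, Xte, yte = train_test_split_70_30(aug_imgs, aug_labs)
print(len(aug_imgs), len(Xtr), len(Xte))  # 40 28 12
```

Augmenting before the split, as above, is one reading of the abstract; splitting first and augmenting only the training set is the alternative that avoids rotated copies of a training image leaking into the test set.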