Classification of stomach infections: A paradigm of convolutional neural network along with classical features fusion and selection
Author(s) - Majid Abdul, Khan Muhammad Attique, Yasmin Mussarat, Rehman Amjad, Yousafzai Abdullah, Tariq Usman
Publication year - 2020
Publication title - Microscopy Research and Technique
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.536
H-Index - 118
eISSN - 1097-0029
pISSN - 1059-910X
DOI - 10.1002/jemt.23447
Subject(s) - pattern recognition (psychology), artificial intelligence, computer science, convolutional neural network, feature extraction, feature selection, classifier (UML)
Automated detection and classification of gastric infections (i.e., ulcer, polyp, esophagitis, and bleeding) through wireless capsule endoscopy (WCE) remains a key challenge. Computer-aided diagnostic (CAD) systems can help doctors identify these endoscopic diseases. In this article, a new fully automated system is proposed for the recognition of gastric infections through multi-type feature extraction, fusion, and robust feature selection. Five key steps are performed: database creation, extraction of handcrafted and convolutional neural network (CNN) deep features, fusion of the extracted features, selection of the best features using a genetic algorithm (GA), and recognition. In the feature extraction step, discrete cosine transform, discrete wavelet transform, strong color, and VGG16-based CNN features are extracted. These features are then fused by simple array concatenation, and a GA is applied to select the best features based on a K-nearest neighbor (KNN) fitness function. Finally, the selected features are provided to an ensemble classifier for the recognition of gastric diseases. A database is prepared from four datasets (Kvasir, CVC-ClinicDB, Private, and ETIS-LaribPolypDB) covering four types of gastric infections: ulcer, polyp, esophagitis, and bleeding. On this database, the proposed technique outperforms existing methods and achieves an accuracy of 96.5%.
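The abstract outlines a fusion-selection-classification pipeline. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes placeholder feature matrices standing in for the DCT/DWT/color and VGG16 descriptors, uses scikit-learn's KNN inside a simple GA fitness function, and substitutes a random forest for the unspecified ensemble classifier. All function names, hyperparameters, and data shapes here are hypothetical.

```python
# Hypothetical sketch: feature fusion by concatenation, GA-based selection
# with a KNN fitness function, and an ensemble classifier for recognition.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

def fuse_features(handcrafted, deep):
    """Serial fusion: simple column-wise array concatenation."""
    return np.concatenate([handcrafted, deep], axis=1)

def ga_select(X, y, n_generations=15, pop_size=20, mutation_rate=0.02, seed=0):
    """Select a binary feature mask by a basic genetic algorithm,
    scoring each candidate mask with 3-fold KNN accuracy."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    population = rng.integers(0, 2, size=(pop_size, n_features))

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        knn = KNeighborsClassifier(n_neighbors=5)
        return cross_val_score(knn, X[:, mask.astype(bool)], y, cv=3).mean()

    for _ in range(n_generations):
        scores = np.array([fitness(ind) for ind in population])
        order = np.argsort(scores)[::-1]
        parents = population[order[: pop_size // 2]]           # keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_features)                  # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_features) < mutation_rate      # bit-flip mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        population = np.vstack([parents] + children)

    best = population[np.argmax([fitness(ind) for ind in population])]
    return best.astype(bool)

# Random stand-in features; in the paper these would be DCT/DWT/color
# descriptors and VGG16 deep features extracted from WCE frames.
X_hand = np.random.rand(200, 64)
X_deep = np.random.rand(200, 128)
y = np.random.randint(0, 4, 200)   # four classes: ulcer, polyp, esophagitis, bleeding

X_fused = fuse_features(X_hand, X_deep)
mask = ga_select(X_fused, y)
X_tr, X_te, y_tr, y_te = train_test_split(X_fused[:, mask], y, random_state=0)
clf = RandomForestClassifier(n_estimators=200).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```

The design choice mirrored here is that the KNN fitness only guides which fused features survive, while a separate ensemble model performs the final recognition; the specific ensemble type, GA settings, and feature dimensionalities in the paper may differ.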