Open Access
Stable polyp‐scene classification via subsampling and residual learning from an imbalanced large dataset
Author(s) - Itoh Hayato, Roth Holger, Oda Masahiro, Misawa Masashi, Mori Yuichi, Kudo Shin‐Ei, Mori Kensaku
Publication year - 2019
Publication title - Healthcare Technology Letters
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.45
H-Index - 19
ISSN - 2053-3713
DOI - 10.1049/htl.2019.0079
Subject(s) - residual, computer science, artificial intelligence, pattern recognition (psychology), contextual image classification, computer vision, machine learning, image (mathematics), algorithm
This Letter presents a stable polyp‐scene classification method with low false positive (FP) detection. Precise automated polyp detection during colonoscopies is essential for preventing colon‐cancer deaths, and there is therefore demand for a computer‐assisted diagnosis (CAD) system for colonoscopies to assist colonoscopists. A high‐performance CAD system that extracts spatiotemporal features with a three‐dimensional convolutional neural network (3D CNN) achieved about 80% detection accuracy on actual colonoscopic videos even with a limited dataset, which suggests that further improvement is feasible by training a 3D CNN on larger data. However, in a large colonoscopic video dataset the ratio of polyp to non‐polyp scenes is highly imbalanced, and this imbalance leads to unstable polyp detection. To circumvent this, the authors propose an efficient and balanced learning technique for deep residual learning. At the beginning of each training epoch, their method randomly selects a subset of non‐polyp‐scene still images equal in number to the polyp‐scene still images. Furthermore, they introduce post‐processing for stable polyp‐scene classification; this post‐processing reduces the FPs that occur in practical applications of polyp‐scene classification. They evaluate several residual networks on a large polyp‐detection dataset consisting of 1027 colonoscopic videos. In the scene‐level evaluation, the proposed method achieves stable polyp‐scene classification with 0.86 sensitivity and 0.97 specificity.
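The per‐epoch balanced subsampling and FP‐suppressing post‐processing described in the abstract can be sketched roughly as follows. This is an illustrative Python/NumPy outline only, not the authors' implementation: the function names, the moving‐average smoothing, the window size, and the threshold are all assumptions introduced for illustration.

    import numpy as np

    def balanced_epoch_indices(polyp_idx, nonpolyp_idx, rng):
        # At each epoch, keep all polyp-scene frames and draw an equally
        # sized random subset of non-polyp frames (sketch of the per-epoch
        # subsampling idea; not the authors' exact code).
        sampled_nonpolyp = rng.choice(nonpolyp_idx, size=len(polyp_idx), replace=False)
        epoch_idx = np.concatenate([polyp_idx, sampled_nonpolyp])
        rng.shuffle(epoch_idx)
        return epoch_idx

    def smooth_predictions(frame_probs, window=15, threshold=0.5):
        # Hypothetical post-processing: a sliding-window moving average over
        # per-frame polyp probabilities, thresholded to suppress isolated
        # false positives. The paper's actual post-processing may differ.
        kernel = np.ones(window) / window
        smoothed = np.convolve(frame_probs, kernel, mode="same")
        return smoothed >= threshold

    # Example usage with hypothetical index arrays (imbalanced dataset).
    rng = np.random.default_rng(0)
    polyp_idx = np.arange(1000)            # indices of polyp-scene frames
    nonpolyp_idx = np.arange(1000, 50000)  # indices of non-polyp frames
    for epoch in range(3):
        idx = balanced_epoch_indices(polyp_idx, nonpolyp_idx, rng)
        # train_one_epoch(model, dataset[idx])  # 3D-CNN training loop omitted

Resampling a fresh non‐polyp subset every epoch lets the network eventually see many different non‐polyp frames while keeping each epoch's class ratio balanced, which is the intuition behind the subsampling step.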