Open Access
Automated identification of cone photoreceptors in adaptive optics optical coherence tomography images using transfer learning
Author(s) - Morgan Heisler, Myeong Jin Ju, Mahadev Bhalla, Nathan Schuck, Arman Athwal, Eduardo V. Navajas, Mirza Faisal Beg, Marinko V. Sarunic
Publication year - 2018
Publication title - Biomedical Optics Express
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.362
H-Index - 86
ISSN - 2156-7085
DOI - 10.1364/boe.9.005353
Subject(s) - optical coherence tomography, computer science, artificial intelligence, adaptive optics, computer vision, optics, convolutional neural network, image processing, cone (formal languages), identification (biology), pattern recognition (psychology), physics, image (mathematics), algorithm, botany, biology
Automated measurement of the human cone mosaic requires the identification of individual cone photoreceptors. The current gold standard, manual labeling, is a tedious process that cannot be completed in a clinically useful timeframe. We therefore present an automated algorithm for identifying cone photoreceptors in adaptive optics optical coherence tomography (AO-OCT) images. Our approach fine-tunes a convolutional neural network originally trained on adaptive optics scanning laser ophthalmoscope (AO-SLO) images so that it transfers to previously unseen data from a different imaging modality. Across twenty AO-OCT images acquired from five normal subjects, the automated method identified, on average, 94% of the cones labeled by manual raters. Voronoi analysis confirmed the expected hexagonal-packing structure of the cone mosaic as well as the variability in cone density across different regions of the retina. The consistency of our measurements demonstrates the reliability and practical utility of an automated solution to this problem.
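To make the transfer-learning step concrete, the following is a minimal PyTorch sketch of fine-tuning a patch classifier pretrained on one modality (AO-SLO) for a new one (AO-OCT). The architecture, the weights file `ao_slo_pretrained.pt`, and the hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal transfer-learning sketch: freeze pretrained convolutional features,
# fine-tune only the classifier head on patches from the new modality.
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    """Hypothetical classifier for 32x32 grayscale patches:
    cone-centered patch vs. background patch."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 8 * 8, 64), nn.ReLU(), nn.Linear(64, 2)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = PatchCNN()
# Load weights pretrained on AO-SLO patches (placeholder path).
model.load_state_dict(torch.load("ao_slo_pretrained.pt"))

# Freeze the early convolutional layers; only the head is updated.
for p in model.features.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# Fine-tuning loop; ao_oct_loader is an assumed DataLoader yielding
# (patches, labels) batches drawn from the AO-OCT training images.
# for patches, labels in ao_oct_loader:
#     optimizer.zero_grad()
#     loss = criterion(model(patches), labels)
#     loss.backward()
#     optimizer.step()
```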
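The Voronoi analysis mentioned in the abstract can likewise be sketched with SciPy: each detected cone seeds a Voronoi cell, six-sided cells indicate hexagonal packing, and cell areas yield a density estimate. The coordinate file and the pixel scale below are placeholder assumptions.

```python
# Voronoi analysis sketch: fraction of six-sided cells and cone density
# from detected cone coordinates (unbounded border cells are excluded).
import numpy as np
from scipy.spatial import Voronoi

coords = np.loadtxt("cone_coordinates.csv", delimiter=",")  # (N, 2), pixels
vor = Voronoi(coords)

# Bounded regions only: unbounded cells contain the vertex index -1.
bounded = [vor.regions[r] for r in vor.point_region
           if len(vor.regions[r]) > 0 and -1 not in vor.regions[r]]

n_sides = np.array([len(region) for region in bounded])
print(f"Fraction of six-sided Voronoi cells: {np.mean(n_sides == 6):.2f}")

def cell_area(region):
    # Shoelace formula over the ordered polygon vertices of one cell.
    pts = vor.vertices[region]
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

um_per_px = 1.0  # placeholder image scale (micrometres per pixel)
areas_um2 = [cell_area(region) * um_per_px ** 2 for region in bounded]
density_per_mm2 = 1e6 / np.mean(areas_um2)  # 1 mm^2 = 1e6 um^2
print(f"Estimated cone density: {density_per_mm2:.0f} cones/mm^2")
```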
