Open Access
Digital staining through the application of deep neural networks to multi-modal multi-photon microscopy
Author(s) -
Navid Borhani,
Andrew J. Bower,
Stephen A. Boppart,
Demetri Psaltis
Publication year - 2019
Publication title -
biomedical optics express
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.362
H-Index - 86
ISSN - 2156-7085
DOI - 10.1364/boe.10.001339
Subject(s) - microscopy , two photon excitation microscopy , optics , haematoxylin , microscope , eosin , optical sectioning , modal , fluorescence microscope , computer science , artificial intelligence , materials science , computer vision , biomedical engineering , staining , fluorescence , physics , pathology , medicine , polymer chemistry
Deep neural networks have been used to map multi-modal, multi-photon microscopy measurements of a label-free tissue sample to its corresponding histologically stained brightfield microscope colour image. It is shown that the extra structural and functional contrasts provided by using two source modes, namely two-photon excitation microscopy and fluorescence lifetime imaging, result in a more faithful reconstruction of the target haematoxylin and eosin stained mode. This modal mapping procedure can aid histopathologists, since it provides access to unobserved imaging modalities, and translates the high-dimensional numerical data generated by multi-modal, multi-photon microscopy into traditionally accepted visual forms. Furthermore, by combining the strengths of traditional chemical staining and modern multi-photon microscopy techniques, modal mapping enables label-free, non-invasive studies of in vivo tissue samples or intravital microscopic imaging inside living animals. The results show that modal co-registration and the inclusion of spatial variations increase the visual accuracy of the mapped results.
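The abstract describes an image-to-image "modal mapping" from label-free multi-photon measurements (two-photon excitation intensity and fluorescence lifetime maps) to a virtual haematoxylin-and-eosin colour image. The paper's exact network architecture and training details are not reproduced on this page; the following is only a minimal, hypothetical sketch of such a mapping, assuming a small convolutional encoder-decoder with a 2-channel label-free input, a 3-channel RGB output, and a pixel-wise mean-squared-error loss on co-registered image pairs. All names and hyperparameters below are illustrative.

```python
# Hypothetical sketch of a digital-staining (modal mapping) network.
# Assumed, not taken from the paper: encoder-decoder layout, channel counts,
# loss function, and optimiser settings.
import torch
import torch.nn as nn


class DigitalStainingNet(nn.Module):
    def __init__(self, in_channels: int = 2, out_channels: int = 3):
        super().__init__()
        # Encoder: downsample the label-free input while growing feature depth.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to input resolution and predict an RGB image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, out_channels, kernel_size=3, padding=1),
            nn.Sigmoid(),  # colour values constrained to [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


def train_step(model, optimizer, label_free, he_target):
    """One update on a co-registered (label-free, H&E) patch pair."""
    optimizer.zero_grad()
    prediction = model(label_free)
    loss = nn.functional.mse_loss(prediction, he_target)
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    model = DigitalStainingNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    # Dummy batch: 4 patches, 2 source modes (TPEF intensity + FLIM lifetime),
    # 128 x 128 pixels, paired with 3-channel H&E-style targets.
    label_free = torch.rand(4, 2, 128, 128)
    he_target = torch.rand(4, 3, 128, 128)
    print(train_step(model, optimizer, label_free, he_target))
```

The 2-channel input reflects the abstract's point that combining the two source modes gives richer structural and functional contrast than either alone; the pixel-wise loss is one simple choice and presumes the co-registration between the label-free and stained images that the abstract identifies as important for visual accuracy.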
