Open Access
Label2label: training a neural network to selectively restore cellular structures in fluorescence microscopy
Author(s) - Lisa Sophie Kölln, Omar Salem, Jessica Valli, Carsten Gram Hansen, Gail McConnell
Publication year - 2022
Publication title - Journal of Cell Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.384
H-Index - 278
eISSN - 1477-9137
pISSN - 0021-9533
DOI - 10.1242/jcs.258994
Subject(s) - convolutional neural network, immunofluorescence, biology, artificial intelligence, pattern recognition (psychology), contrast (vision), microscopy, computer science, physics, optics, antibody, immunology
Immunofluorescence microscopy is routinely used to visualise the spatial distribution of proteins that dictates their cellular function. However, unspecific antibody binding often results in high cytosolic background signals, decreasing the image contrast of a target structure. Recently, convolutional neural networks (CNNs) have been successfully employed for image restoration in immunofluorescence microscopy, but current methods cannot correct for these background signals. We report a new method that trains a CNN to reduce unspecific signals in immunofluorescence images; we name this method label2label (L2L). In L2L, a CNN is trained with image pairs of two non-identical labels that target the same cellular structure. We show that, after L2L training, a network predicts images with significantly increased contrast of a target structure, which is further improved after implementing a multiscale structural similarity loss function. Our results suggest that sample differences in the training data decrease the hallucination effects that are observed with other methods. We further assess the performance of a cycle generative adversarial network, and show that a CNN can be trained to separate structures in superposed immunofluorescence images of two targets.
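To illustrate the training scheme described in the abstract, the sketch below shows what L2L-style training could look like in PyTorch: the network input and training target are images of the same cellular structure acquired with two different labels, and the loss combines an L1 term with a multiscale structural similarity (MS-SSIM) term. This is not the authors' implementation; the network, dataset, loss weighting and the use of the third-party pytorch_msssim package are all assumptions for illustration.

```python
# Hypothetical sketch of L2L-style training (not the published implementation).
# paired_dataset is assumed to yield (label_a_img, label_b_img) tensors of shape
# (1, H, W): two images of the SAME structure, each stained with a different label,
# so the structure of interest is the main content shared between input and target.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from pytorch_msssim import MS_SSIM  # third-party package, assumed available


class CombinedLoss(nn.Module):
    """L1 term plus a multiscale structural similarity (MS-SSIM) term,
    standing in for the loss function mentioned in the abstract."""

    def __init__(self, alpha=0.84):  # weighting is an assumed value, not the paper's
        super().__init__()
        self.alpha = alpha
        self.l1 = nn.L1Loss()
        # Note: MS-SSIM with default settings needs images of roughly 160x160 px or larger.
        self.ms_ssim = MS_SSIM(data_range=1.0, channel=1)

    def forward(self, pred, target):
        return (self.alpha * (1.0 - self.ms_ssim(pred, target))
                + (1.0 - self.alpha) * self.l1(pred, target))


def train_l2l(network, paired_dataset, epochs=50, lr=1e-4, device="cuda"):
    """Train a restoration CNN on label-to-label image pairs."""
    network = network.to(device)
    loader = DataLoader(paired_dataset, batch_size=8, shuffle=True)
    optimiser = torch.optim.Adam(network.parameters(), lr=lr)
    criterion = CombinedLoss()
    for _ in range(epochs):
        for label_a, label_b in loader:
            label_a, label_b = label_a.to(device), label_b.to(device)
            optimiser.zero_grad()
            prediction = network(label_a)          # restore the structure from the label-A image
            loss = criterion(prediction, label_b)  # compare against the label-B image of the same structure
            loss.backward()
            optimiser.step()
    return network
```

Because the two labels differ in their unspecific background while sharing the target structure, a network trained this way is pushed to reproduce the shared structure rather than the label-specific background, which is the intuition behind the reported contrast improvement.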
