Open Access
Learning Cell Nuclei Segmentation Using Labels Generated With Classical Image Analysis Methods
Author(s) -
Damian J. Matuszewski,
Peter Ranefall
Publication year - 2021
Publication title -
Computer Science Research Notes
Language(s) - English
Resource type - Conference proceedings
SCImago Journal Rank - 0.11
H-Index - 4
eISSN - 2464-4625
pISSN - 2464-4617
DOI - 10.24132/csrn.2021.3101.37
Subject(s) - Jaccard index, computer science, convolutional neural network, pipeline (software), artificial intelligence, segmentation, deep learning, bottleneck, pattern recognition (psychology), Sørensen–Dice coefficient, Dice, image segmentation, image (mathematics), mathematics, geometry, programming language, embedded system
Creating manual annotations for a large number of images is a tedious bottleneck that limits the use of deep learning in many applications. Here, we present a study in which we used the output of a classical image analysis pipeline as labels when training a convolutional neural network (CNN). This may not only reduce the time experts spend annotating images, but may also lead to better results than those of the classical pipeline used for training. In our application, cell nuclei segmentation, we generated the annotations using CellProfiler (a tool for developing classical image analysis pipelines for biomedical applications) and trained a U-Net-based CNN model on them. The best model achieved a Dice coefficient of 0.96 for the segmented nuclei and an object-wise Jaccard index of 0.84, outperforming the classical method used to generate the annotations by 0.02 and 0.34, respectively. Our experimental results show that, in this application, such training is not only feasible but that the deep learning segmentations are a clear improvement over the output of the classical pipeline used to generate the annotations.
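The two evaluation metrics named in the abstract can be sketched in a few lines. The following is a minimal illustration (not the authors' code): a pixel-wise Dice coefficient on binary masks, and an object-wise Jaccard index computed as TP / (TP + FP + FN) after matching predicted nuclei to ground-truth nuclei by intersection-over-union. The 0.5 IoU matching threshold is a common instance-segmentation convention and an assumption here; the paper may define object matching differently.

```python
# Sketch of the two metrics reported in the abstract: pixel-wise Dice
# coefficient and an object-wise Jaccard index. The IoU-based object
# matching (threshold 0.5) is an assumed convention, not taken from the paper.
import numpy as np
from scipy import ndimage


def dice_coefficient(pred, gt):
    """Pixel-wise Dice coefficient between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    total = pred.sum() + gt.sum()
    return 2.0 * intersection / total if total else 1.0


def objectwise_jaccard(pred, gt, iou_threshold=0.5):
    """Object-wise Jaccard: TP / (TP + FP + FN), where a predicted object
    is a true positive if it overlaps an unmatched ground-truth object
    with IoU above the threshold."""
    pred_lab, n_pred = ndimage.label(pred)  # connected components
    gt_lab, n_gt = ndimage.label(gt)
    matched_gt, tp = set(), 0
    for p in range(1, n_pred + 1):
        p_mask = pred_lab == p
        # ground-truth labels overlapping this predicted object
        for g in np.unique(gt_lab[p_mask]):
            if g == 0 or g in matched_gt:
                continue
            g_mask = gt_lab == g
            iou = (np.logical_and(p_mask, g_mask).sum()
                   / np.logical_or(p_mask, g_mask).sum())
            if iou > iou_threshold:
                tp += 1
                matched_gt.add(g)
                break
    fp, fn = n_pred - tp, n_gt - tp
    return tp / (tp + fp + fn) if (tp + fp + fn) else 1.0
```

For example, if a prediction recovers only one of two equally sized ground-truth nuclei perfectly, the Dice coefficient is 2/3 while the object-wise Jaccard index drops to 0.5 (one TP, one FN), illustrating why the object-wise score is the stricter of the two.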
