Open Access
Robust optical flow algorithm for general single cell segmentation
Author(s) - Michael C. Robitaille, Jeff M. Byers, Joseph A. Christodoulides, Marc P. Raphael
Publication year - 2022
Publication title - PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0261763
Subject(s) - segmentation, computer science, artificial intelligence, optical flow, image segmentation, scale space segmentation, algorithm, computer vision, segmentation based object categorization, machine learning, pattern recognition (psychology), image (mathematics)
Cell segmentation is crucial to the field of cell biology, as the accurate extraction of single-cell morphology, migration, and ultimately behavior from time-lapse live-cell imagery is of paramount importance for elucidating and understanding basic cellular processes. To expand the set of segmentation tools that perform reliably across research groups and platforms, we introduce a novel segmentation approach centered on optical flow and show that it achieves robust segmentation of single cells by validating it on multiple cell types, phenotypes, optical modalities, and in vitro environments, with or without labels. By leveraging cell movement in time-lapse imagery to distinguish cells from their background and augmenting the output with machine vision operations, our algorithm reduces the number of adjustable parameters requiring manual optimization to two. We show that this approach offers quicker processing times than contemporary machine learning-based methods, which require manual labeling for training, and in most cases achieves higher-quality segmentation as well. The algorithm is packaged in MATLAB, offering an accessible means of general cell segmentation in a time-efficient manner.
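As a rough illustration of the motion-based pipeline the abstract describes, the MATLAB sketch below thresholds dense optical-flow magnitude between two consecutive frames and cleans the resulting mask with standard morphological operations. This is not the authors' released code: the Farnebäck estimator is just one possible dense-flow choice, the file names frame01.tif and frame02.tif are placeholders, and flowThresh and minArea are illustrative stand-ins for the two user-adjustable parameters the abstract mentions. It assumes the Computer Vision and Image Processing Toolboxes are installed.

% Illustrative sketch only, not the published implementation.
flowThresh = 0.5;    % minimum flow magnitude treated as cell motion (assumed value)
minArea    = 50;     % smallest object kept, in pixels (assumed value)

flowModel = opticalFlowFarneback;                     % dense optical-flow estimator
prev = im2gray(im2double(imread('frame01.tif')));     % hypothetical frame files
estimateFlow(flowModel, prev);                        % prime estimator with first frame
curr = im2gray(im2double(imread('frame02.tif')));
flow = estimateFlow(flowModel, curr);                 % dense flow field for second frame

mask = flow.Magnitude > flowThresh;                   % moving pixels -> candidate cells
mask = imclose(mask, strel('disk', 3));               % bridge small gaps in cell bodies
mask = imfill(mask, 'holes');                         % fill enclosed holes
mask = bwareaopen(mask, minArea);                     % discard sub-cellular specks
labels = bwlabel(mask);                               % one integer label per cell

In this two-parameter framing, flowThresh separates moving foreground from static background and minArea suppresses noise-sized detections; everything else is a fixed machine vision operation, which is the property that keeps manual tuning minimal.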
