Comparative evaluation of performance measures for shading correction in time‐lapse fluorescence microscopy
Author(s) -
LIU L.,
KAN A.,
LECKIE C.,
HODGKIN P.D.
Publication year - 2017
Publication title -
Journal of Microscopy
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.569
H-Index - 111
eISSN - 1365-2818
pISSN - 0022-2720
DOI - 10.1111/jmi.12512
Subject(s) - shading, measure (data warehouse), computer science, ground truth, range (aeronautics), artificial intelligence, microscopy, computer vision, optics, data mining, materials science, physics, computer graphics (images), composite material
Summary Time‐lapse fluorescence microscopy is a valuable technology in cell biology, but it suffers from an inherent intensity inhomogeneity caused by uneven illumination or camera nonlinearity, known as shading artefacts. These artefacts lead to inaccurate estimates of single‐cell features such as average and total intensity. Numerous shading correction methods have been proposed to remove this effect, and many quantitative performance measures have been developed to compare them. However, there is little discussion of which performance measure should generally be applied when evaluating on real data, where the ground truth is absent. In this paper, the state‐of‐the‐art shading correction methods and performance evaluation methods are reviewed. We apply 10 popular shading correction methods to two artificial datasets and four real ones. To compare those methods objectively, we employ a number of quantitative performance measures. Extensive validation demonstrates that the coefficient of joint variation (CJV) is the most applicable measure for time‐lapse fluorescence images. Based on this measure, we propose a novel shading correction method that outperforms well‐established methods across the range of real data tested.
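The abstract does not reproduce the CJV formula, but in the intensity‐inhomogeneity literature it is commonly defined as (sigma_fg + sigma_bg) / |mu_fg - mu_bg|, computed over two pixel classes such as foreground (cells) and background; a lower value indicates less residual shading. The Python sketch below is a minimal illustration under that assumption. The function name cjv, the synthetic shading field, and the division‐based correction are illustrative conveniences, not the authors' implementation.

import numpy as np

def cjv(image, mask):
    """Coefficient of joint variation between foreground and background.

    `mask` is a boolean array marking foreground (cell) pixels; its
    complement is treated as background. Lower CJV means less
    intensity inhomogeneity, i.e. better shading correction.
    """
    fg = image[mask].astype(float)
    bg = image[~mask].astype(float)
    return (fg.std() + bg.std()) / abs(fg.mean() - bg.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 128x128 frame: a bright disk (foreground) on a dim
    # background, multiplied by a smooth illumination gradient that
    # plays the role of the shading field.
    yy, xx = np.mgrid[:128, :128]
    mask = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2
    truth = np.where(mask, 200.0, 50.0) + rng.normal(0, 5, (128, 128))
    shading = 0.5 + xx / 128.0          # left-to-right intensity falloff
    observed = truth * shading

    # Idealised retrospective correction: divide by the known shading
    # field (real methods must estimate this field from the data).
    corrected = observed / shading
    print(f"CJV before correction: {cjv(observed, mask):.3f}")
    print(f"CJV after  correction: {cjv(corrected, mask):.3f}")

On this toy example the CJV drops substantially after correction, which is the behaviour the paper exploits when using CJV to rank correction methods on real data lacking ground truth.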
