Choosing a new CD4 technology: Can statistical method comparison tools influence the decision?
Author(s) - Scott Lesley E., Kestens Luc, Pattanapanyasat Kovit, Sukapirom Kasma, Stevens Wendy S.
Publication year - 2017
Publication title - Cytometry Part B: Clinical Cytometry
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.646
H-Index - 61
eISSN - 1552-4957
pISSN - 1552-4949
DOI - 10.1002/cyto.b.21522
Subject(s) - concordance, concordance correlation coefficient, similarity (geometry), bland–altman plot, sensitivity (control systems), standard deviation, statistics, accuracy and precision, computer science, data mining, mathematics, sample (material), correlation coefficient, artificial intelligence, limits of agreement, nuclear medicine, medicine, chemistry, image (mathematics), engineering, chromatography, electronic engineering
Background - Method comparison tools are used to determine the accuracy, precision, agreement, and clinical relevance of a new or improved technology versus a reference technology. Guidelines for the most appropriate method comparison tools, as well as their acceptable limits, are lacking and not standardized for CD4 counting technologies.

Methods - Different method comparison tools were applied to a previously published CD4 dataset (n = 150 data pairs) evaluating five different CD4 counting technologies (TruCOUNT, Dual Platform, FACSCount, Easy CD4, CyFlow) on a single specimen. Bland–Altman, percentage similarity, percent difference, concordance correlation, sensitivity, specificity, and misclassification method comparison tools were applied, and agreement was visualized with Passing–Bablok and Bland–Altman scatter plots.

Results - The FACSCount (median CD4 = 245 cells/µl) was considered the reference for method comparison. An algorithm was developed using best practices of the most applicable method comparison tools and, together with a modified heat map, was found useful for method comparison of qualitative and quantitative CD4 results. The algorithm applied the concordance correlation for overall accuracy and precision, then the standard deviation of the absolute bias and the percentage similarity coefficient of variation to assess agreement, and lastly sensitivity and misclassification rates for clinical relevance.

Conclusion - Combining method comparison tools is more useful for evaluating CD4 technologies against a reference CD4 technology than any single tool alone. This algorithm should be further validated using CD4 external quality assessment data and studies with larger sample sizes. © 2017 International Clinical Cytometry Society
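The stepwise algorithm described above (concordance correlation for overall accuracy and precision, bias SD and percentage-similarity CV for agreement, then sensitivity and misclassification for clinical relevance) can be illustrated with a short sketch. The Python code below is a minimal illustration, not the authors' implementation; the 200 cells/µl clinical threshold, the synthetic paired data, and the exact percentage-similarity formulation are assumptions made for demonstration only.

```python
import numpy as np

def concordance_correlation(ref, test):
    """Lin's concordance correlation coefficient (overall accuracy and precision)."""
    x, y = np.asarray(ref, float), np.asarray(test, float)
    mx, my = x.mean(), y.mean()
    sxy = ((x - mx) * (y - my)).mean()          # population covariance
    return 2 * sxy / (x.var() + y.var() + (mx - my) ** 2)

def bland_altman(ref, test):
    """Mean bias, SD of the absolute bias, and 95% limits of agreement."""
    diff = np.asarray(test, float) - np.asarray(ref, float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

def percentage_similarity_cv(ref, test):
    """One common formulation: PS = mean of each pair / reference * 100; return its CV (%)."""
    x, y = np.asarray(ref, float), np.asarray(test, float)
    ps = ((x + y) / 2.0) / x * 100.0
    return ps.std(ddof=1) / ps.mean() * 100.0

def clinical_classification(ref, test, threshold=200):
    """Sensitivity and misclassification rate around an assumed CD4 threshold (cells/µl)."""
    ref_below = np.asarray(ref, float) < threshold
    test_below = np.asarray(test, float) < threshold
    tp = np.sum(ref_below & test_below)          # correctly flagged below threshold
    fn = np.sum(ref_below & ~test_below)         # upward misclassification
    fp = np.sum(~ref_below & test_below)         # downward misclassification
    sensitivity = tp / ref_below.sum() if ref_below.sum() else float("nan")
    misclassification = (fn + fp) / ref_below.size
    return sensitivity, misclassification

if __name__ == "__main__":
    # Hypothetical paired counts standing in for reference vs. new technology.
    rng = np.random.default_rng(0)
    ref = rng.normal(245, 80, 150).clip(10)
    new = ref * 1.05 + rng.normal(0, 15, 150)

    print("CCC:", concordance_correlation(ref, new))
    print("Bias, SD, 95% LoA:", bland_altman(ref, new))
    print("Percentage similarity CV (%):", percentage_similarity_cv(ref, new))
    print("Sensitivity, misclassification:", clinical_classification(ref, new))
```

Each function maps to one stage of the algorithm, so a candidate technology could be screened in the same order the abstract describes: overall accuracy and precision first, then agreement, then clinical classification.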