
Reliability and Validity of Clinician ECG Interpretation for Athletes
Author(s) - Magee Charles, Kazman Joshua, Haigney Mark, Oriscello Ralph, DeZee Kent J., Deuster Patricia, Depenbrock Patrick, O'Connor Francis G.
Publication year - 2014
Publication title - Annals of Noninvasive Electrocardiology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.494
H-Index - 48
eISSN - 1542-474X
pISSN - 1082-720X
DOI - 10.1111/anec.12138
Subject(s) - medicine, kappa, inter-rater reliability, Cohen's kappa, athletes, diagnostic accuracy, reliability, primary care, cardiology, physical therapy, family medicine, rating scale
Background: Use of the electrocardiogram (ECG) in the preparticipation evaluation (PPE) of athletes remains controversial in the United States, and the diagnostic accuracy of clinician ECG interpretation is unclear. This study aimed to assess the reliability and validity of clinician ECG interpretation using expert-validated ECGs, according to the 2010 European Society of Cardiology (ESC) interpretation criteria.

Methods: This was a blinded, prospective study of the diagnostic accuracy of clinician ECG interpretation. From October 2011 through March 2012, anonymized ECGs were validated as normal or abnormal by blinded expert interpreters according to the ESC interpretation criteria. Six pairs of clinician interpreters were recruited from relevant clinical specialties at an academic medical center in March 2012. Each clinician interpreted 85 ECGs according to the ESC interpretation guidelines. Cohen's and Fleiss' kappa, sensitivity, and specificity were calculated within specialties and across the primary care and cardiology specialty groups.

Results: Experts interpreted 189 ECGs, yielding a kappa of 0.63, indicating "substantial" inter-rater agreement. A total of 85 validated ECGs, including 26 abnormal ECGs, were selected for clinician interpretation. Kappa was "substantial" among cardiology specialists and "moderate" among primary care clinicians (0.69 vs 0.52, respectively; P < 0.001). Sensitivity and specificity for detecting abnormal patterns were similar between the cardiology and primary care groups (sensitivity 93.3% vs 81.3%, respectively, P = 0.31; specificity 88.8% vs 89.8%, respectively, P = 0.91).

Conclusions: Clinician ECG interpretation according to the ESC interpretation criteria demonstrates limited reliability and validity. Before widespread adoption of ECG in the PPE of U.S. athletes, further research on training to improve the reliability and validity of clinician ECG interpretation is warranted.
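
For readers unfamiliar with the agreement statistics named in the Methods, the following is a minimal illustrative sketch, not the authors' analysis code, of how Cohen's kappa, Fleiss' kappa, sensitivity, and specificity can be computed for binary normal/abnormal ECG calls. The data and rater names are hypothetical, and the scikit-learn and statsmodels libraries are assumed to be available.

import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical data: 1 = abnormal, 0 = normal.
truth   = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 1])  # expert-validated labels
rater_a = np.array([1, 0, 1, 0, 0, 0, 1, 0, 1, 1])  # clinician A's reads
rater_b = np.array([1, 0, 0, 0, 0, 0, 1, 0, 0, 1])  # clinician B's reads

# Cohen's kappa: chance-corrected agreement between one pair of raters.
print("Cohen's kappa (A vs B):", cohen_kappa_score(rater_a, rater_b))

# Fleiss' kappa: chance-corrected agreement across multiple raters
# rating the same set of ECGs.
ratings = np.column_stack([rater_a, rater_b])  # rows = ECGs, cols = raters
table, _ = aggregate_raters(ratings)           # counts per category per ECG
print("Fleiss' kappa:", fleiss_kappa(table, method="fleiss"))

# Sensitivity and specificity of one rater against the validated labels.
tn, fp, fn, tp = confusion_matrix(truth, rater_a).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))

In a design like the one described, such pairwise kappas would be computed within each specialty pair, and sensitivity/specificity would be pooled across the primary care and cardiology groups against the expert-validated reference standard.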