Direct Observation Assessment of Ultrasound Competency Using a Mobile Standardized Direct Observation Tool Application With Comparison to Asynchronous Quality Assurance Evaluation
Author(s) - Boniface Keith S., Ogle Kat, Aalam Ahmad, LeSaux Maxine, Pyle Matt, Mandoorah Sohaib, Shokoohi Hamid
Publication year - 2019
Publication title - AEM Education and Training
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.49
H-Index - 9
ISSN - 2472-5390
DOI - 10.1002/aet2.10324
Subject(s) - medicine, quality assurance, medical physics, observational study, external quality assessment, pathology
Abstract -

Objectives: Competency assessment is a key component of point-of-care ultrasound (POCUS) training. The purpose of this study was to design a smartphone-based standardized direct observation tool (SDOT) and to compare faculty-observed competency assessment at the bedside with a blinded reference-standard assessment during quality assurance (QA) review of ultrasound images.

Methods: In this prospective, observational study, an SDOT was created using SurveyMonkey, containing specific scoring and evaluation items based on the Council of Emergency Medicine Residency-Academy of Emergency Ultrasound Consensus Document for the Emergency Ultrasound Milestone Project. Ultrasound faculty used the mobile phone-based data collection tool as an SDOT at the bedside while students, residents, and fellows performed one of eight core POCUS examinations. Data recorded included demographic data, examination-specific data, overall quality measures (on a scale of 1-5, with 3 and above defined as adequate for clinical decision making), and interpretation and clinical knowledge. Each POCUS examination was recorded and uploaded to QPath, a HIPAA-compliant ultrasound archive, and later reviewed by another faculty member blinded to the result of the bedside evaluation. The primary outcome was agreement between the two evaluation methods on examinations scored adequate (3 and above).

Results: A total of 163 direct observation evaluations were collected from 23 EM residents (93 SDOTs [57%]), 14 students (51 SDOTs [31%]), and four fellows (19 SDOTs [12%]). Trainees were evaluated on cardiac (54 [33%]), focused assessment with sonography for trauma (34 [21%]), biliary (25 [15%]), aorta (18 [11%]), renal (12 [7%]), pelvis (eight [5%]), deep vein thrombosis (seven [4%]), and lung (five [3%]) examinations. Overall, the number of observed agreements between bedside and QA assessments on the quality of images (scores 1 and 2 vs. scores 3, 4, and 5) was 81 (87.1% of the observations); the strength of agreement was "fair" (κ = 0.251, 95% confidence interval [CI] = 0.02-0.48). Further analysis showed fair agreement for images obtained by residents and students and "perfect" agreement for images obtained by fellows. Overall, "moderate" inter-rater agreement was found in 79.1% of observations for the accuracy of interpretation of the POCUS scan (e.g., true positive, false negative) between QA and bedside evaluation (κ = 0.48, 95% CI = 0.34-0.63). Bedside faculty and QA assessment reached moderate agreement on interpretations by residents and students and "good" agreement on fellows' scans.

Conclusion: A bedside SDOT delivered through a mobile SurveyMonkey platform facilitates competency assessment in emergency ultrasound learners and correlates well with traditional competency evaluation by asynchronous weekly image-review QA.
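For readers who want to reproduce this style of agreement analysis, the sketch below illustrates the dichotomization and statistic described in the Results: 1-5 image-quality scores are collapsed into adequate (3 and above) vs. inadequate, and Cohen's kappa is computed with an approximate 95% CI. The rating vectors are hypothetical placeholders, not the study data, and the CI uses Cohen's simple large-sample standard-error approximation, which may differ from the method the authors used.

import numpy as np

def binary_kappa_with_ci(rater_a, rater_b, z=1.96):
    """Cohen's kappa for two binary rating vectors, with an approximate
    large-sample 95% CI (Cohen's simple standard-error formula)."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    n = len(a)
    po = np.mean(a == b)  # observed proportion of agreement
    # chance agreement: probability both raters say "adequate" plus
    # probability both say "inadequate", under independence
    pe = np.mean(a) * np.mean(b) + np.mean(1 - a) * np.mean(1 - b)
    kappa = (po - pe) / (1 - pe)
    se = np.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))  # approximate SE
    return kappa, (kappa - z * se, kappa + z * se)

# Hypothetical scores on the study's 1-5 quality scale (not the study data):
bedside = np.array([4, 3, 2, 5, 3, 1, 4, 3, 2, 5, 3, 4])  # bedside SDOT rater
qa      = np.array([3, 4, 2, 5, 2, 1, 4, 3, 3, 5, 2, 4])  # blinded QA rater

# Dichotomize as in the study: >= 3 is "adequate for clinical decision making"
k, (lo, hi) = binary_kappa_with_ci(bedside >= 3, qa >= 3)
print(f"kappa = {k:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")

The qualitative labels in the abstract ("fair," "moderate," "good," "perfect") correspond to conventional kappa interpretation bands, so a kappa near 0.25 reads as fair agreement and one near 0.48 as moderate, matching the reported image-quality and interpretation results respectively.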