The objective structured clinical examination: can physician‐examiners participate from a distance?
Author(s) -
Chan James,
Humphrey-Murto Susan,
Pugh Debra M,
Su Charles,
Wood Timothy
Publication year - 2014
Publication title -
Medical Education
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.776
H-Index - 138
eISSN - 1365-2923
pISSN - 0308-0110
DOI - 10.1111/medu.12326
Subject(s) - objective structured clinical examination, checklist, medical education, physical examination, oral examination, educational measurement, psychology, medicine, scale (ratio), family medicine, curriculum, surgery, geography, cartography, pedagogy, cognitive psychology, oral health
Objectives - Currently, a 'pedagogical gap' exists in distributed medical education: distance educators teach medical students but typically do not have the opportunity to assess them in large-scale examinations such as the objective structured clinical examination (OSCE). We developed a remote examiner OSCE (reOSCE) that was integrated into a traditional OSCE to establish whether remote examination technology may be used to bridge this gap. The purpose of this study was to explore whether remote physician-examiners can replace on-site physician-examiners in an OSCE, and to determine the feasibility of this new examination method.
Methods - Forty Year 3 medical students were randomised into six reOSCE stations that were incorporated into two tracks of a 10-station traditional OSCE. For the reOSCE stations, student performance was assessed by both a local examiner (LE) in the room and a remote examiner (RE) who viewed the OSCE encounters from a distance. The primary endpoint was the correlation of scores between LEs and REs across all reOSCE stations. The secondary endpoint was a post-OSCE survey of both REs and students.
Results - Statistically significant correlations were found between LE and RE checklist scores for history taking (r = 0.64–0.80), physical examination (r = 0.41–0.54) and management stations (r = 0.78). Correlations between LE and RE global ratings were more varied (r = 0.21–0.77), reaching significance on three of the six stations. Qualitative analysis of feedback from REs and students showed high acceptance of the reOSCE despite technological issues.
Conclusions - This preliminary study demonstrated that OSCE ratings by LEs and REs were reasonably comparable when checklists were used. Remote examination may be a feasible and acceptable way of assessing students' clinical skills, but further validity evidence will be required before it can be recommended for use in high-stakes examinations.