Student‐written single‐best answer questions predict performance in finals
Author(s) -
Walsh Jason,
Harris Benjamin,
Tayyaba Saadia,
Harris David,
Smith Phil
Publication year - 2016
Publication title -
The Clinical Teacher
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.354
H-Index - 26
eISSN - 1743-498X
pISSN - 1743-4971
DOI - 10.1111/tct.12445
Subject(s) - summative assessment , formative assessment , medical education , medical school , United States Medical Licensing Examination , mathematics education , psychology , medicine , incentive , cohort
Summary -
Background: Single‐best answer (SBA) questions are widely used for assessment in medical schools; however, clinical staff often have neither the time nor the incentive to develop high‐quality material for revision purposes. A student‐led approach to producing formative SBA questions offers a potential solution.
Methods: Cardiff University School of Medicine students created a bank of SBA questions through a previously described staged approach involving student question‐writing, peer review and targeted senior clinician input. We arranged the questions into discrete tests and posted these online. Performance on these tests by student volunteers from the 2012/13 cohort of final‐year medical students was recorded and compared with their performance in medical school finals (knowledge examinations and objective structured clinical examinations, OSCEs). In addition, we compared the performance of students who participated in question‐writing groups with that of the rest of the cohort on the summative SBA assessment.
Results: Performance in the end‐of‐year summative clinical knowledge SBA paper correlated strongly with performance in the formative student‐written SBA test (r = ~0.60, p < 0.01). There was no significant correlation between summative OSCE scores and formative student‐written SBA test scores. Students who wrote and reviewed questions scored higher than average in the end‐of‐year summative clinical knowledge SBA paper.
Conclusion: Student‐written SBAs predict performance in end‐of‐year SBA examinations, and can therefore provide a potentially valuable revision resource. There is potential for student‐written questions to be incorporated into summative examinations.