Answering MCQs: a Study of Confidence Amongst Medical Students
Author(s) - Rogers Michael S., Chung Tony, Li Albert
Publication year - 1992
Publication title - Australian and New Zealand Journal of Obstetrics and Gynaecology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.734
H-Index - 65
eISSN - 1479-828X
pISSN - 0004-8666
DOI - 10.1111/j.1479-828x.1992.tb01925.x
Subject(s) - honour , obstetrics and gynaecology , medical education , subject (documents) , medicine , medical school , psychology , library science , law , political science , computer science , pregnancy , biology , genetics
EDITORIAL COMMENT: The techniques used to examine medical students in Obstetrics and Gynaecology, at least in the University of Melbourne, remained virtually unchanged until approximately 15 years ago, since when change has become the order of the day. This applies to the written examination (formerly 3, now 1), the clinical and oral examinations (formerly 4, now 1), the assessment of the student by the hospital consultants, and last but not least the decision of the Faculty of Medicine to award an Honours grade to 40–50% of the candidates. As our former famous Dean of the Faculty of Medicine was fond of saying, 'All change is not progress'. The old system of frugality with honours should not be defended, because it provided no encouragement for the battler to aspire to an honours mark; yet for 40–50% to earn an honour/distinction requires examiners to train themselves to give an honours mark to the majority of the year in the individual encounter, unless one favours a mathematical conversion of scores to reach a predetermined proportion of honours candidates.

Medical students do not seem to have changed much over the years in the editor's opinion, and he has examined in obstetrics and gynaecology for 28 years. Sadly, he has seen what was a final year subject, properly grouped with Medicine and Surgery, relegated to a 10-week block in the penultimate year of the course, thus providing a guarantee that the average student will have forgotten most of what he/she has learned about obstetrics, gynaecology and neonatal paediatrics before graduation, a lamentable state of affairs. All of the systems the editor has seen in operation seemed to readily identify the best and the worst students, with the majority being grouped en masse well above the pass mark but far short of the mark of excellence.

In the 1960s the multiple choice examination was introduced in the University of Melbourne clinical examinations in tandem with the old system for purposes of comparison, and it was found to be most satisfactory, selecting the stars and drones with equal certitude. However, the maintenance of question banks was a problem, as was the need to review questions and answers as opinion changed with time and clinical progress, so that what had been accepted by the team of examiners as the correct answer became incorrect. There was also the possibility of a leak from the bank of questions, although some examiners, including the writer, believed that if the bank was very large, then it could officially be made available to all students before they commenced their clerkship in the discipline. The multiple choice paper was then superseded by the short answer paper which, although giving a consistently lower score than the multiple choice method, also seemed to sort out the best and worst students from the majority of in-betweeners.

We accepted this paper for publication because all readers should retain at least a 'passing interest' in methods of examination. How many readers can remember the examination encounter of years ago with frightening clarity, or still dream of searching a posted list of examination numbers to see if their number had failed to be included! The authors note that there is one major weakness of a multiple choice questionnaire: namely, there is a reasonable probability that a candidate who takes a gamble and answers as many questions as possible without thinking will improve his/her overall mark, regardless of the degree of certainty that the answer selected was correct.
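The scale of this guessing advantage is easy to see with a back-of-the-envelope expected-value sketch. The figures below are purely illustrative, not the study's data, and the sketch assumes a true/false paper with no penalty for wrong answers (the marking scheme is not detailed here).

```python
# Illustrative only: expected marks for two answering strategies on a
# true/false paper, assuming no negative marking (an assumption, not a
# detail confirmed in the paper). The numbers below are hypothetical.

n_items = 100          # hypothetical number of true/false items
p_known = 0.5          # fraction of items the candidate actually knows

# Strategy A: answer only the items you are certain of, leave the rest blank.
expected_a = n_items * p_known * 1.0                      # blanks earn nothing

# Strategy B: answer everything; unknown items become coin-flip guesses.
expected_b = n_items * (p_known * 1.0 + (1 - p_known) * 0.5)

print(f"certain-only strategy : {expected_a:.0f} marks expected")
print(f"answer-everything     : {expected_b:.0f} marks expected")
# With no penalty for wrong answers, each guessed item adds half a mark on
# average, so the gambler's expected score is always the higher of the two.
```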
The candidate who is more thoughtful and who answers an MCQ slowly and logically may suffer. It is the editor's opinion that although written examinations are an essential part of the examination system, so too is the 'clinical' examination, which sadly is now conducted without the candidate having examined a patient. Indeed, some centres practise the farce where the examiner has to act out the part of the patient as the student is examined to assess his/her skills in communication. This often proves to be an examination in acting, with both candidate and examiner being embarrassed. One of the problems is that the provision of 'willing' patients for clinical examinations is much more difficult than formerly. This does not mean that the 'Objective' Structured Clinical Examination (OSCE) in its various forms is a suitable replacement. Although boards of examiners continue to strive to perfect a fair and objective system for the evaluation of candidates, it seems to the writer that for proper assessment of the clinicians of tomorrow, the clinicians of today must continue to support the clinical examination, where the judgement of the candidate by the examiner remains important. A computer-driven marking system has its own inadequacies.

Summary: Medical students at the beginning of their obstetrics and gynaecology module were asked to complete a multiple choice question paper from an earlier module. Half the students were asked to answer only those questions where they were certain of the answers; the other half were asked to answer all questions. The mean mark in the second group was 86% higher than that in the first group. A computer programme was written in which multiple choice questions were asked in a standard true/false format, but instead of a 'don't know' alternative the students were asked to rate their degree of certainty in having answered correctly on a scale of 0 to 100. Five students completed a total of 45 multiple choice question papers (each with 20, 5-part questions) both before and towards the end of their obstetrics and gynaecology module. The mean mark increased by 68.5% over the course of the module, reflecting the students' increased knowledge. Their mean certainty level increased by only 50%, suggesting that the students underestimated their newly acquired knowledge.
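For readers who like to see the mechanics, the following is a minimal sketch of how such a certainty-rated true/false paper could be recorded and summarised. It is not the authors' programme: the data layout, field names and scoring rule (one mark per correct part, no negative marking) are assumptions made purely for illustration.

```python
# Minimal sketch of scoring a true/false paper in which each answer carries a
# 0-100 certainty rating, in the spirit of the format described above.
# This is NOT the authors' programme; structures and scoring are assumed.

from dataclasses import dataclass
from statistics import mean


@dataclass
class Response:
    answer: bool       # candidate's true/false answer
    correct: bool      # keyed (model) answer
    certainty: int     # self-rated certainty, 0-100


def summarise(responses: list[Response]) -> tuple[float, float]:
    """Return (mean mark %, mean certainty) for one completed paper."""
    marks = [r.answer == r.correct for r in responses]
    return 100.0 * mean(marks), mean(r.certainty for r in responses)


# Hypothetical example: a candidate early in the module, often unsure.
paper = [
    Response(answer=True,  correct=True,  certainty=90),
    Response(answer=False, correct=True,  certainty=20),
    Response(answer=True,  correct=False, certainty=40),
    Response(answer=False, correct=False, certainty=70),
]
mark_pct, mean_certainty = summarise(paper)
print(f"mark = {mark_pct:.0f}%, mean certainty = {mean_certainty:.0f}/100")
# Comparing these two figures before and after the module is, in essence, how
# the study judged whether students' confidence kept pace with their knowledge.
```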