Evaluation of self‐directed clinical education: validation of an instrument
Author(s) - Dornan T, Boshuizen H, Cordingley L, Hider S, Hadfield J, Scherpbier A
Publication year - 2004
Publication title - Medical Education
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.776
H-Index - 138
eISSN - 1365-2923
pISSN - 0308-0110
DOI - 10.1111/j.1365-2929.2004.01837.x
Subject(s) - psychology , medical education , educational measurement , medical physics , medicine , curriculum , pedagogy
Abstract -
Aim: To explore the evaluation of self‐directed, integrated clinical education.
Methods: We delivered a quantitative and qualitative self‐report questionnaire to students through their web‐based learning management system. The questionnaire was distributed 4 times over 1 year, each time in 2 parts. A generic part evaluated boundary conditions for learning, teaching activities and ‘real patient learning’. Factor analysis with varimax rotation was used to validate the constructs that made up the scale and to stimulate hypotheses about how they interrelated. A module‐specific part evaluated real patient learning of the subject matter in the curriculum.
Results: A total of 101 students gave 380 of a possible 404 responses (94%). The generic data loaded onto 4 factors, corresponding to: firm quality; hospital‐based teaching and learning; community and out‐patient learning; and problem‐based learning (PBL). A 5‐item quality index had content, construct and criterion validity. Quality differed greatly between firms. Self‐evaluation of module‐specific, real patient learning was also valid and was strongly influenced by the specialty interests of hospital firms.
Conclusions: Quality is a multidimensional construct. Self‐report evaluation of real patient learning is feasible and could be capitalised on to promote reflective self‐direction. The social and material context of learning is an important dimension of educational quality.
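As a rough illustration of the analysis named in the Methods, the sketch below runs an exploratory factor analysis with varimax rotation on questionnaire item scores using scikit-learn. The item names, the DataFrame layout, the random placeholder data and the four-factor solution are assumptions for illustration only, not the authors' actual instrument or code.

```python
# Illustrative sketch only: exploratory factor analysis with varimax rotation,
# as described in the Methods. Item names and response data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical questionnaire responses: rows = returned questionnaires,
# columns = generic items scored on a Likert-type scale (random placeholders).
rng = np.random.default_rng(0)
items = [
    "firm_organisation", "firm_teaching", "ward_teaching", "bedside_learning",
    "outpatient_clinics", "community_placement", "pbl_tutorials", "pbl_cases",
]
responses = pd.DataFrame(
    rng.integers(1, 6, size=(380, len(items))), columns=items
)

# Standardise items before extracting factors.
z = (responses - responses.mean()) / responses.std(ddof=0)

# Four-factor solution with varimax rotation
# (the rotation argument requires scikit-learn >= 0.24).
fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
fa.fit(z)

# Rotated loadings: rows = factors, columns = items. Items loading strongly
# on the same factor are read as one construct (e.g. 'firm quality', 'PBL').
loadings = pd.DataFrame(
    fa.components_,
    columns=items,
    index=[f"factor_{i + 1}" for i in range(4)],
)
print(loadings.round(2).T)
```

Dedicated packages such as factor_analyzer report additional diagnostics (communalities, fit indices); the sketch only shows the shape of the construct-validation step, not the paper's results.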