Evaluating the impact of moving from discipline‐based to integrated assessment
Author(s) -
Hudson J N,
Tonkin A L
Publication year - 2004
Publication title -
Medical Education
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.776
H-Index - 138
eISSN - 1365-2923
pISSN - 0308-0110
DOI - 10.1111/j.1365-2929.2004.01893.x
Subject(s) - objective structured clinical examination , curriculum , cronbach's alpha , medical education , reliability , psychology , validity , educational measurement , medicine , pedagogy , psychometrics , clinical psychology
Background
The move from discipline‐based to problem‐based learning (PBL) at Adelaide University in 2000 offered exciting opportunities to integrate the teaching and learning of the basic and clinical sciences for medical undergraduates. However, several cohorts of students still needed to progress through the first 3 years of the more traditional curriculum. Paradoxically, their readiness to function in the integrated learning and assessment environment of the last 3 years was assessed in 7 separate discipline‐based examinations at the end of third year. When considerable examination‐related stress was noted in the 1997 cohort and students petitioned formally for a reduced examination load, it was considered time for assessment to lead the way in integrating the disciplines.

Aim
After introducing third year integrated written assessments in 1998, we aimed to develop an integrated practical examination (IPE) linking theory to practice, and to evaluate its impact on staff and students.

Methods
After extensive staff collaboration, a structured objective multi‐station IPE was developed and administered in 1999 and 2000. Its utility was evaluated using a model proposed earlier.

Results
Assessment validity was maximised by an extensive item review process. Reliability, as measured by Cronbach's alpha, was 0.79 and 0.80 in 1999 and 2000, respectively. An independent evaluation yielded qualitative data on the examination's educational impact, cost and acceptability.

Conclusions
Investing time in changing from discipline‐based to integrated assessment, integrating theory and practice, resulted in gains in assessment reliability, validity and educational impact on both staff and students.
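The internal-consistency statistic reported above, Cronbach's alpha, is computed from the per-station score variances and the variance of candidates' total scores. As a minimal sketch (the station scores below are made up for illustration, not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for k items (e.g. exam stations).

    item_scores: list of k lists, each holding one station's scores
    across the same candidates, in the same candidate order.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(item_scores)
    # Total score per candidate across all stations
    totals = [sum(scores) for scores in zip(*item_scores)]
    sum_item_var = sum(pvariance(s) for s in item_scores)
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Hypothetical scores: 4 stations x 5 candidates
scores = [
    [3, 4, 5, 2, 4],
    [2, 4, 5, 3, 4],
    [3, 3, 4, 2, 5],
    [2, 4, 5, 2, 4],
]
print(round(cronbach_alpha(scores), 2))
```

Values around 0.8, as obtained in 1999 and 2000, are conventionally taken to indicate acceptable reliability for a multi-station examination.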