Scoring a Performance‐Based Assessment by Modeling the Judgments of Experts
Author(s) -
Clauser Brian E.,
Subhiyah Raja G.,
Nungester Ronald J.,
Ripkey Douglas R.,
Clyman Stephen G.,
McKinley Danette
Publication year - 1995
Publication title -
Journal of Educational Measurement
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.917
H-Index - 47
eISSN - 1745-3984
pISSN - 0022-0655
DOI - 10.1111/j.1745-3984.1995.tb00474.x
Subject(s) - raw score , computer science , process (computing) , machine learning , artificial intelligence
Performance assessments typically require expert judges to individually rate each performance. This limits the use of such assessments because the rating process may be extremely time consuming. This article describes a scoring algorithm that is based on expert judgments but requires the rating of only a sample of performances. A regression‐based policy‐capturing procedure was implemented to model the judgment policies of experts. The data set was a seven‐case performance assessment of physicians' patient‐management skills, administered through a computer‐based simulation of the patient care environment. The results showed a substantial improvement in correspondence between scores produced using the algorithm and actual ratings, when compared to raw scores. Scores based on the algorithm were also shown to be superior to raw scores, and equal to expert ratings, for making pass/fail decisions that agreed with those made by an independent committee of experts.
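The core idea of regression‐based policy capturing can be sketched briefly: experts rate a sample of performances, a linear model is fit regressing those ratings on quantifiable features of each performance, and the fitted weights are then applied to score the remaining, unrated performances. The sketch below is a minimal illustration of that workflow; the feature definitions and data are hypothetical, not the variables actually used in the study.

```python
import numpy as np

def capture_policy(features, expert_ratings):
    """Fit a linear model of expert ratings on performance features
    (ordinary least squares with an intercept term)."""
    X = np.column_stack([np.ones(len(features)), features])
    weights, *_ = np.linalg.lstsq(X, expert_ratings, rcond=None)
    return weights

def score(features, weights):
    """Apply the captured judgment policy to score performances."""
    X = np.column_stack([np.ones(len(features)), features])
    return X @ weights

# Hypothetical data: each row summarizes one examinee's performance on a
# case by counts of beneficial and risky actions taken (illustrative
# features only).
rated = np.array([[8., 1.], [5., 3.], [9., 0.], [3., 4.]])
ratings = np.array([7., 4., 9., 2.])  # expert ratings of the sampled cases

w = capture_policy(rated, ratings)

# Performances the experts never rated are scored by the model instead.
unrated = np.array([[7., 2.], [4., 1.]])
print(score(unrated, w))
```

In the study itself, model-based scores produced in this fashion corresponded more closely to actual expert ratings than raw scores did; the sketch only shows the mechanics of fitting and applying a captured policy.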