
EQUATING OF MIXED‐FORMAT TESTS IN LARGE‐SCALE ASSESSMENTS
Author(s) - Sooyeon Kim, Michael E. Walker, Frederick McHale
Publication year - 2008
Publication title - ETS Research Report Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.235
H-Index - 5
ISSN - 2330-8516
DOI - 10.1002/j.2333-8504.2008.tb02112.x
Subject(s) - equating, statistics, scale (ratio), computer science, test, mathematics, Rasch model
This study examined variations of the nonequivalent-groups equating design for mixed-format tests, which contain both multiple-choice (MC) and constructed-response (CR) items, to determine which design was most effective in producing equivalent scores across the two tests to be equated. Four linking designs were examined: (a) an anchor containing only MC items; (b) a mixed-format anchor containing both MC and CR items; (c) a mixed-format anchor incorporating CR item rescoring; and (d) a hybrid of the single-group and equivalent-groups designs, which avoids the need for an anchor test. Designs using an MC-only anchor or a mixed-format anchor without CR item rescoring produced much larger bias than the other two approaches. The hybrid design yielded the smallest root mean squared error.
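
To make the evaluation criteria concrete, the following is a minimal Python sketch, not the authors' procedure, of one way equating error can be summarized under a nonequivalent-groups anchor design: a chained linear equating from the new form to the old form through the anchor, with bias and root mean squared error computed against a criterion equating function. All score distributions, sample sizes, and the criterion function here are hypothetical assumptions for illustration only.

```python
# Hedged sketch: chained linear equating under a NEAT-style design, evaluated
# by bias and RMSE against a hypothetical criterion equating. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scores: new form X with anchor V (new-form group),
# old form Y with anchor V (old-form group).
x_new, v_new = rng.normal(30, 6, 2000), rng.normal(15, 3, 2000)
y_old, v_old = rng.normal(32, 6, 2000), rng.normal(16, 3, 2000)

def linear_link(from_scores, to_scores):
    """Linear function mapping the 'from' score scale onto the 'to' scale."""
    a = to_scores.std() / from_scores.std()
    b = to_scores.mean() - a * from_scores.mean()
    return lambda s: a * s + b

# Chain the two links: X -> V estimated in the new-form group,
# then V -> Y estimated in the old-form group.
x_to_v = linear_link(x_new, v_new)
v_to_y = linear_link(v_old, y_old)
equate = lambda x: v_to_y(x_to_v(x))

# Hypothetical criterion equating (e.g., from a large single-group study).
criterion = lambda x: 1.02 * x + 1.5

score_points = np.arange(0, 61)
diff = equate(score_points) - criterion(score_points)
bias = diff.mean()
rmse = np.sqrt((diff ** 2).mean())
print(f"bias = {bias:.3f}, RMSE = {rmse:.3f}")
```

Under this kind of setup, the same bias and RMSE summary would be repeated for each anchor composition (MC-only anchor, mixed-format anchor with and without CR rescoring, and the anchor-free hybrid) so that the designs can be compared on a common criterion.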