Reviewing and Changing Answers on Computer‐adaptive and Self‐adaptive Vocabulary Tests
Author(s) -
Vispoel, Walter P.
Publication year - 1998
Publication title -
Journal of Educational Measurement
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.917
H-Index - 47
eISSN - 1745-3984
pISSN - 0022-0655
DOI - 10.1111/j.1745-3984.1998.tb00542.x
Subject(s) - computerized adaptive testing , trustworthiness , vocabulary , psychology , test (biology) , statistics , mathematics education , social psychology , clinical psychology , mathematics , psychometrics , linguistics , philosophy , paleontology , biology
Results obtained from computer‐adaptive and self‐adaptive tests were compared under conditions in which item review was permitted and not permitted. Comparisons of answers before and after review within the “review” condition showed that a small percentage of answers was changed (5.23%), that more answers were changed from wrong to right than from right to wrong (by a ratio of 2.92:1), that most examinees (66.5%) changed answers to at least some questions, that most examinees who changed answers improved their ability estimates by doing so (by a ratio of 2.55:1), and that review was particularly beneficial to examinees at high ability levels. Comparisons between the “review” and “no‐review” conditions yielded no significant differences in ability estimates or in estimated measurement error and provided no trustworthy evidence that test anxiety moderated the effects of review on those indexes. Most examinees desired review, but permitting it increased testing time by 41%.
