Investigating the Effect of Item Position in Computer‐Based Tests
Author(s) - Li Feiming, Cohen Allan, Shen Linjun
Publication year - 2012
Publication title - Journal of Educational Measurement
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.917
H-Index - 47
eISSN - 1745-3984
pISSN - 0022-0655
DOI - 10.1111/j.1745-3984.2012.00181.x
Subject(s) - rasch model , item response theory , item position , licensure , equating , answer copying , statistics , psychometrics , psychology , econometrics , computer science , mathematics , medicine
Computer‐based tests (CBTs) often present items in random order to minimize item exposure and reduce the potential for answer copying. Little research has been done, however, to examine item position effects for these tests. In this study, different versions of a Rasch model and different response time models were applied to data from a CBT administration of a medical licensure examination. Specifically, the models were used to investigate whether item position affected estimates of item difficulty and item time intensity. Results indicated that the position effect was negligible.
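The idea of a position effect in a Rasch model can be sketched as an extra shift applied to item difficulty. The minimal example below is an illustration only: the linear term `gamma * position` and the function names are assumptions for exposition, not the parameterization used in the study.

```python
import math

def rasch_prob(theta, b):
    """Standard Rasch model: probability of a correct response,
    given person ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def rasch_prob_position(theta, b, gamma, position):
    """Hypothetical position-effect extension: effective difficulty
    is shifted by gamma for each step later the item appears.
    A negligible position effect corresponds to gamma near zero."""
    return rasch_prob(theta, b + gamma * position)

# With gamma = 0 the extended model reduces to the standard Rasch model,
# which is the (approximate) finding reported in the abstract.
p_standard = rasch_prob(0.5, -0.2)
p_extended = rasch_prob_position(0.5, -0.2, 0.0, position=30)
```

A study of position effects would compare such a model against the plain Rasch model, checking whether the estimated `gamma` differs meaningfully from zero.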
