Using Invariance to Examine Cheating in Unproctored Ability Tests
Author(s) - Natalie A. Wright, Adam W. Meade, Sara L. Gutierrez
Publication year - 2014
Publication title - International Journal of Selection and Assessment
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.812
H-Index - 61
eISSN - 1468-2389
pISSN - 0965-075X
DOI - 10.1111/ijsa.12053
Subject(s) - cheating , psychology , social psychology , differential item functioning , measurement invariance , applied psychology , psychometrics , item response theory , statistics , confirmatory factor analysis
Despite their widespread use in personnel selection, there is concern that cheating could undermine the validity of unproctored Internet‐based tests. This study examined the presence of cheating on a speeded ability test used for personnel selection. The same test was administered to applicants under either proctored or unproctored conditions. Item response theory differential functioning analyses were used to evaluate the equivalence of the psychometric properties of test items across the proctored and unproctored conditions. A few items displayed different psychometric properties, but the nature of these differences was not uniform. Theta scores did not reflect widespread cheating among unproctored examinees. Thus, the results were not consistent with what would be expected if cheating on unproctored tests were pervasive.