Open Access
CONSTRUCTED‐RESPONSE DIF EVALUATIONS FOR MIXED‐FORMAT TESTS
Author(s) - Tim Moses, Jinghua Liu, Adele Tan, Weiling Deng, Neil J. Dorans
Publication year - 2013
Publication title - ETS Research Report Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.235
H-Index - 5
ISSN - 2330-8516
DOI - 10.1002/j.2333-8504.2013.tb02340.x
Subject(s) - differential item functioning, matching (statistics), statistics, item response theory, similarity (geometry), psychology, test (biology), mathematics, psychometrics, econometrics, computer science, artificial intelligence, paleontology, image (mathematics), biology
In this study, differential item functioning (DIF) methods using 14 different matching variables were applied to assess DIF in the constructed‐response (CR) items from 6 forms of 3 mixed‐format tests. The results suggested that the methods can produce distinct patterns of DIF results for different tests and testing programs: DIF results might be similar for tests whose multiple‐choice (MC) and CR scores have similar measurement characteristics but might vary more widely for tests whose MC and CR scores have more distinct measurement characteristics. Impact measures of the MC and CR scores appeared to be a useful basis for indicating the scores' measurement similarity, for predicting the variation in DIF results when these scores are used as matching variables, and possibly for identifying the most appropriate DIF method and matching variable for a particular test. The results are described in terms of their implications for research and practice.
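The report itself contains no code, but the core idea of matching-based DIF can be illustrated concretely. Below is a minimal Python sketch, not the authors' specific method, of one widely used CR DIF statistic: the standardized mean difference (SMD), in which focal- and reference-group examinees are matched on a conditioning score (here an MC total score, one plausible matching variable of the 14 studied) and the focal-minus-reference difference in expected item score is averaged using the focal group's weights. All function and variable names are hypothetical.

```python
import numpy as np

def smd_dif(item_scores, matching_scores, group, focal_label="focal"):
    """Standardized mean difference (SMD) DIF for a polytomous CR item.

    At each level of the matching variable, compute the focal-minus-reference
    difference in mean item score, then weight it by the focal group's
    proportion at that level (the standardization approach associated with
    Dorans & Kulick, 1986).
    """
    item_scores = np.asarray(item_scores, dtype=float)
    matching_scores = np.asarray(matching_scores)
    focal = np.asarray(group) == focal_label

    smd = 0.0
    for m in np.unique(matching_scores):
        at_m = matching_scores == m
        f, r = at_m & focal, at_m & ~focal
        if not f.any() or not r.any():
            continue  # a matching level without both groups contributes nothing
        weight = f.sum() / focal.sum()  # focal-group proportion at this level
        smd += weight * (item_scores[f].mean() - item_scores[r].mean())
    return smd

# Hypothetical usage: match on an MC total score, evaluate a 0-4 point CR item.
rng = np.random.default_rng(0)
n = 2000
group = rng.choice(["reference", "focal"], size=n)
mc_total = rng.integers(0, 41, size=n)
cr_item = np.clip(np.round(mc_total / 10 + rng.normal(0, 1, n)), 0, 4)
print(smd_dif(cr_item, mc_total, group))
```

An SMD near zero indicates that, conditional on the matching score, the two groups perform comparably on the CR item; the abstract's point is that this conclusion can shift with the choice of matching variable when a test's MC and CR scores measure different things.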
