
Integrating Cognitive Views Into Psychometric Models for Reading Comprehension Assessment
Author(s) - Taslima Rahman, Robert J. Mislevy
Publication year - 2017
Publication title - ETS Research Report Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.235
H-Index - 5
ISSN - 2330-8516
DOI - 10.1002/ets2.12163
Subject(s) - reading comprehension, item response theory, comprehension, reading, cognition, cognitive psychology, psychometrics, classical test theory, Rasch model, construct validity, covariate, multiple choice, task, test, psychology
To demonstrate how methodologies for assessing reading comprehension can grow out of views of the construct suggested in the reading research literature, we constructed tasks and carried out psychometric analyses framed in accordance with two leading reading models. To estimate item difficulty and, subsequently, examinee proficiency, an item response theory (IRT) model called the linear logistic test model (LLTM) was extended to incorporate reader as well as task attributes as covariates. A novel aspect of this modeling was the inclusion of reader effects (interest and prior knowledge) specific to the text passages that examinees read in the assessment. In the demonstration, the theory-motivated task and reader attributes were significantly related to item difficulty. In particular, examinees' comprehension proficiency estimates were positively affected by within-person effects of a reader's familiarity with and interest in a passage. This study suggests that it is both feasible and informative to incorporate variables for various comprehension components into psychometric analyses.
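As a concrete illustration of the modeling approach summarized above, the following is a minimal sketch of the standard LLTM and of one way reader covariates might enter it. The notation (q, eta, z, gamma, and the passage index t(i)) is assumed for illustration and is not taken from the report itself. In the standard LLTM, the difficulty of item i is a linear combination of task attributes,

  % LLTM decomposition of item difficulty into task-attribute effects
  b_i = \sum_{k} q_{ik}\,\eta_k ,

so the probability that person p answers item i correctly is

  P(X_{pi} = 1 \mid \theta_p) = \frac{\exp\!\left(\theta_p - \sum_{k} q_{ik}\,\eta_k\right)}{1 + \exp\!\left(\theta_p - \sum_{k} q_{ik}\,\eta_k\right)} .

An extension of the kind described in the abstract additionally enters person-by-passage reader attributes, such as interest in and prior knowledge of the passage t(i) to which item i belongs:

  % hypothetical extended model: reader covariates z with weights gamma
  \operatorname{logit} P(X_{pi} = 1) = \theta_p + \sum_{m} \gamma_m\, z_{p,\,t(i),\,m} - \sum_{k} q_{ik}\,\eta_k ,

where z_{p,t(i),m} is reader p's score on attribute m for passage t(i). Because z varies over the passages a given examinee reads, the gamma terms capture within-person effects, which is what distinguishes this formulation from entering reader attributes only as between-person covariates.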