A Validity Framework for Evaluating the Technical Quality of Alternate Assessments
Author(s) - Scott F. Marion, James W. Pellegrino
Publication year - 2006
Publication title - Educational Measurement: Issues and Practice
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.158
H-Index - 52
eISSN - 1745-3992
pISSN - 0731-1745
DOI - 10.1111/j.1745-3992.2006.00078.x
Subject(s) - construct validity , psychometrics , quality assessment , evaluation methods , cognition , psychology , epistemology , Cronbach's alpha
This article presents findings from two projects designed to improve evaluations of the technical quality of alternate assessments for students with the most significant cognitive disabilities. We argue that assessment technical documents should allow for the evaluation of the construct validity of the alternate assessments following the traditions of Cronbach (1971), Messick (1989, 1995), Linn, Baker, and Dunbar (1991), and Shepard (1993). The projects used the work of Knowing What Students Know (Pellegrino, Chudowsky, & Glaser, 2001) to structure and focus the collection and evaluation of assessment information. The heuristic of the assessment triangle (Pellegrino et al., 2001) was particularly useful in emphasizing that the validity evaluation needs to consider the logical connections among the characteristics of the students tested and how they develop domain proficiency (the cognition vertex), the nature of the assessment (the observation vertex), and the ways in which the assessment results are interpreted (the interpretation vertex). These projects have shown that the growing body of knowledge about the psychology of achievement testing can be useful not only for designing more valid assessments but also for structuring evaluations of technical quality.