Validity Evidence for Learning Progression‐Based Assessment Items That Fuse Core Disciplinary Ideas and Science Practices
Author(s) - Amelia Wenk Gotwals, Nancy Butler Songer
Publication year - 2013
Publication title - Journal of Research in Science Teaching
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.067
H-Index - 131
eISSN - 1098-2736
pISSN - 0022-4308
DOI - 10.1002/tea.21083
Subject(s) - argument, discipline, psychology, think-aloud protocol, science education, evidence-based practice, mathematics education, cognitive psychology, cognitive science, sociology
This article evaluates a validity argument for the degree to which assessment tasks are able to provide evidence about knowledge that fuses information from a progression of core disciplinary ideas in ecology and a progression for the scientific practice of developing evidence‐based explanations. The article describes the interpretive framework for the argument, including evidence for how well the assessment tasks are matched to the learning progressions and the methods for interpreting students' responses to the tasks. Findings from a dual‐pronged validity study that includes a think‐aloud analysis and an item difficulty analysis are presented as evidence. The findings suggest that the tasks provide opportunities for students at multiple ability levels to show evidence of both successes and struggles with the development of knowledge that fuses core disciplinary ideas with the scientific practice of developing evidence‐based explanations. In addition, these tasks are generally able to distinguish between students of different ability levels. However, some of the assumptions in the interpretive argument are not supported; for example, the data do not provide evidence that would neatly place students at a given level on our progressions. Implications for the assessment system, specifically how responses are elicited from students, are discussed. In addition, we discuss the implications of our findings for defining and redesigning learning progressions. © 2013 Wiley Periodicals, Inc. J Res Sci Teach 50: 597–626, 2013.
