Open Access
FURTHER VALIDATION OF A WRITING ASSESSMENT FOR GRADUATE ADMISSIONS
Author(s) - Powers, Donald E.; Fowles, Mary E.; Welsh, Cynthia K.
Publication year - 1999
Publication title - ETS Research Report Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.235
H-Index - 5
ISSN - 2330-8516
DOI - 10.1002/j.2333-8504.1999.tb01816.x
Subject(s) - generalizability theory, writing assessment, reliability, academic writing, graduate admissions, graduate students, mathematics education, pedagogy, psychology
The objective of the study reported here was to collect validity evidence for a proposed writing assessment for graduate admissions, the Graduate Record Examinations (GRE®) Writing Assessment. In particular, the objective was to investigate the relationship between student performance on each of two exercises being considered for the assessment and several nontest indicators of writing skill and achievement, thereby establishing the degree to which performance on the GRE Writing Assessment is related to writing performance in academic settings. A variety of nontest indicators were examined, but a particular focus was the quality of students' course‐related writing samples. Two such writing samples were collected for each participant in the study, along with considerable information about the nature of these samples. These data enabled an estimate of the reliability and generalizability of writing samples as a validity criterion. The data also permitted an analysis of the conditions and circumstances under which performance on course‐related writing assignments and performance on the two new writing exercises relate to one another, thus facilitating the interpretation of scores derived from the new writing assessment. The results revealed modest relationships between performance on the writing assessment essays and various nontest indicators of writing ability. Performance on the GRE Writing Assessment exhibited the strongest relationship with course‐related writing samples, arguably the most compelling of the nontest indicators. There was no indication that the relationship between the essays and course‐related writing samples depended on particular characteristics of the sample.
