Open Access
Usability of Interactive Item Types and Tools Introduced in the New GRE® revised General Test
Author(s) -
Swiggett Wanda D.,
Kotloff Laurie,
Ezzo Chelsea,
Adler Rachel,
Oliveri Maria Elena
Publication year - 2014
Publication title - ETS Research Report Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.235
H-Index - 5
ISSN - 2330-8516
DOI - 10.1002/ets2.12028
Subject(s) - test (biology) , calculator , usability , notice , think aloud protocol , psychology , construct (python library) , variance (accounting) , computer science , computerized adaptive testing , mathematics education , applied psychology , human–computer interaction , psychometrics , paleontology , clinical psychology , accounting , political science , law , business , biology , programming language , operating system
The computer‐based Graduate Record Examinations® (GRE®) revised General Test includes interactive item types and testing environment tools (e.g., test navigation, on‐screen calculator, and help). How well do test takers understand these innovations? If test takers do not understand the new item types, these innovations may introduce construct‐irrelevant variance, with test takers performing differently than they would with more familiar item types. Similarly, the navigational and other test environment tools are another potential source of variance if some test takers understand how to use them and others do not. In this study, we examined the reactions, engagement, and difficulties of 20 potential test takers as they completed Verbal and Quantitative Reasoning sections of a practice GRE test. Participants were sophomores and juniors from colleges and universities in the local area. Their reactions were captured through cognitive laboratory sessions that incorporated interviews requiring test takers to think aloud, as well as researcher observations as test takers worked quietly. Analysis of these data revealed that some participants needed time to figure out what was being asked of them when they encountered the new item types, although most were able to answer each item eventually. In contrast, most participants stated that they did not even notice the test environment tools, and few were observed actually using them. Several participants provided suggestions for improving the usability of the new item types and test environment tools.