When is the Same Not the Same? Issues of Question Equivalence in Online Exam Pools
Author(s) -
Leupen Sarah,
Hodges Linda C.,
Bass Sarah,
Carpenter Tara,
Goolsby-Cole Cody,
Stanwyck Liz
Publication year - 2022
Publication title -
The FASEB Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.709
H-Index - 277
eISSN - 1530-6860
pISSN - 0892-6638
DOI - 10.1096/fasebj.2022.36.s1.r1938
Subject(s) - cheating , globe , mathematics education , test (biology) , psychology , equity (law) , online learning , computer science , ecology , social psychology , world wide web , biology , political science , neuroscience , law
During the pandemic, the use of question pools for online testing was recommended to mitigate cheating, exposing multitudes of STEM students across the globe to this practice. Yet little systematic analysis of the practice apparently exists. In this study, we undertook an investigation of student performance on our questions in online exam pools across several STEM courses: upper‐level physiology, general chemistry, and introductory physics. We found that the difficulty of creating analogous questions in a pool varied by question type, with quantitative problems being the easiest to vary without altering average student performance. However, when instructors created pools by rearranging aspects of a question, posing opposite counterparts of concepts, or formulating questions assessing the same learning objective, we sometimes discovered student learning differences between seemingly closely‐related ideas, illustrating the challenge of our own expert blind spot. We provide suggestions for instructors on steps to take to improve the equity of question pools, such as being cautious in how many variables one changes in a specific pool and “test driving” proposed questions in lower stakes assessments.
