Open Access
Utilization of a Think-Aloud Protocol to Cognitively Validate a Survey Instrument Identifying Social Capital Resources of Engineering Undergraduates
Author(s) -
Julie P. Martin,
Matthew K. Miller,
Kyle Gipson
Publication year - 2020
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/1-2--18492
Subject(s) - think aloud protocol , protocol analysis , protocol (science) , usability , computer science , psychology , applied psychology , variety (cybernetics) , cognition , human–computer interaction , artificial intelligence , medicine , cognitive science , alternative medicine , pathology , neuroscience
The use of verbal report (e.g. “think-aloud”) techniques in developing a survey instrument can be critical to establishing an instrument’s cognitive validity, which helps ensure that participants interpret and respond to survey items in the manner intended by the survey designer(s). The primary advantage of utilizing a verbal cognitive validation protocol is having evidence that survey items are interpreted by participants in the same way the researcher intended before the instrument is administered to a large sample. Think-aloud protocols have been used to accomplish different goals in a variety of fields, including engineering education, where think-alouds are commonly used in problem solving research. While think-alouds have been used by engineering education researchers, the engineering education literature includes few resources regarding the use of these protocols with respect to large-scale survey development and refinement. In this paper, we present a protocol based on elements of think-alouds conducted inside and outside the engineering education domain. By presenting results and examples from our own experience using this protocol, we aim to provide a cognitive validation model which may be useful to engineering education researchers designing their own survey instruments. By following the model outlined in this paper, participants in our study verbalized several issues of concern when interacting with our web-based survey. These issues ranged from minor grammatical errors to serious cognitive mismatches which caused participants to interpret and/or respond to items differently than we intended. Participants were asked for suggestions to correct these issues, and changes were made to the survey based on this feedback. The survey was retested in two additional iterations of think-aloud sessions with new participants to ensure the revisions successfully remedied the issues encountered by previous participants.
Finally, the refined survey was pilot tested and subsequently reviewed by an expert in the field before being administered at seven institutions. This paper includes evidence and specific examples of how the cognitive validation model resulted in a refined survey instrument, as well as recommendations for other engineering education researchers wishing to employ similar techniques in designing and validating survey instruments.

Introduction and Motivation

Much of the extant literature on establishing reliability and validity for survey instruments focuses predominantly on statistical methods for measures such as construct validity or internal consistency. These approaches require rigorous statistical techniques to compute coefficients such as Cronbach’s alpha to verify that the instrument has achieved at least a minimum acceptable level of reliability and/or validity (e.g. Eris and colleagues). Such statistical methods can establish a case for whether or not the instrument consistently and appropriately measures participant responses to items by evaluating a variety of properties, which include (but are not limited to) ensuring the items within the instrument have appropriate coverage of the relevant content, are scored or evaluated consistently, and/or are
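For readers unfamiliar with the reliability coefficient mentioned above, Cronbach's alpha compares the sum of individual item variances to the variance of respondents' total scores: alpha = (k/(k-1)) * (1 - sum(item variances) / total-score variance) for k items. The following is a minimal sketch of that computation in Python (using NumPy); the response matrix is entirely hypothetical Likert-scale data invented for illustration, not data from this study.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # sample variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 survey items
responses = np.array([
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 3, 3, 4],
])
print(round(cronbach_alpha(responses), 3))
```

Values closer to 1 indicate higher internal consistency; a common rule of thumb treats roughly 0.7 as a minimum acceptable level. As the paper notes, however, a high coefficient alone does not demonstrate that participants interpreted the items as the designers intended, which is the gap the cognitive validation protocol addresses.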
