The Effects of Inattentive Responding on Construct Validity Evidence When Measuring Social–Emotional Learning Competencies
Author(s) - Steedle, Jeffrey T.; Hong, Maxwell; Cheng, Ying
Publication year - 2019
Publication title - Educational Measurement: Issues and Practice
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.158
H-Index - 52
eISSN - 1745-3992
pISSN - 0731-1745
DOI - 10.1111/emip.12256
Subject(s) - psychology , construct validity , confirmatory factor analysis , social emotional learning , concurrent validity , predictive validity , criterion validity , test validity , external validity , social psychology , developmental psychology , psychometrics , structural equation modeling , statistics , internal consistency
Self-report inventories are commonly administered to measure social–emotional learning competencies related to college and career readiness. Inattentive responding can undermine the validity of individual score interpretations and the accuracy of construct validity evidence. This study applied nine methods of detecting insufficient effort responding (IER) to a social–emotional learning assessment. Individual methods flagged between 0.9% and 20.3% of respondents as potentially exhibiting IER. Removing flagged respondents from the data resulted in negligible or small improvements in criterion-related validity, coefficient alpha, concurrent validity, and confirmatory factor analysis model–data fit. Implications for future validity studies and for the operational use of IER detection with social–emotional learning assessments are discussed.
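The record does not list the nine IER indices the authors applied. As a rough, self-contained sketch of the general workflow the abstract describes (screen response vectors with a common IER index, then recompute coefficient alpha with flagged respondents removed), the Python below simulates Likert-type data and uses squared Mahalanobis distance as the screen. The choice of index, the simulated item counts and group sizes, and the 95th-percentile flagging cutoff are illustrative assumptions, not values from the study.

```python
import numpy as np


def mahalanobis_d2(data):
    """Squared Mahalanobis distance of each respondent's response vector from the item means."""
    centered = data - data.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(data, rowvar=False))
    return np.einsum("ij,jk,ik->i", centered, cov_inv, centered)


def coefficient_alpha(data):
    """Cronbach's coefficient alpha for an (n_respondents, n_items) score matrix."""
    k = data.shape[1]
    item_vars = data.var(axis=0, ddof=1)
    total_var = data.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)


rng = np.random.default_rng(2019)
n_items, n_attentive, n_careless = 20, 480, 30

# Attentive responders: 5-point Likert items driven by a single latent trait (assumed model).
theta = rng.normal(size=(n_attentive, 1))
attentive = np.clip(
    np.rint(3 + theta + rng.normal(scale=0.9, size=(n_attentive, n_items))), 1, 5
)

# Careless responders: uniform random answers, unrelated to any trait.
careless = rng.integers(1, 6, size=(n_careless, n_items)).astype(float)

data = np.vstack([attentive, careless])

# Flag the most aberrant response vectors (hypothetical 95th-percentile cutoff).
d2 = mahalanobis_d2(data)
flagged = d2 > np.quantile(d2, 0.95)

print(f"flagged {flagged.sum()} of {len(data)} respondents")
print(f"alpha, full sample:     {coefficient_alpha(data):.3f}")
print(f"alpha, flagged removed: {coefficient_alpha(data[~flagged]):.3f}")
```

In this toy setup the careless group answers uniformly at random, so removing flagged cases nudges alpha upward; with other IER patterns (e.g., straight-lining) the direction and size of the change can differ, which is why studies like this one typically compare several detection indices.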