Expert and Target Scoring: Their relation, corresponding test instructions, and their effects on the construct validity of the video‐based social understanding test (VSU)
Author(s) -
Kristin Conzelmann,
Panja Goerke
Publication year - 2015
Publication title -
International Journal of Selection and Assessment
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.812
H-Index - 61
eISSN - 1468-2389
pISSN - 0965-075X
DOI - 10.1111/ijsa.12090
Subject(s) - psychology, construct validity, convergent validity, test validity, psychometrics, internal consistency, cognition, applied psychology, social psychology, clinical psychology
This study investigated the relation between expert and target scoring of a video‐based social understanding test (VSU) under two different types of instructions (internal and observer). The effects of the scoring methods and instructions on the VSU's construct validity were also examined. A total of 529 pilot applicants completed the VSU (some with internal and some with observer instructions), cognitive ability and knowledge tests, and a personality questionnaire. A subsample (n = 132) completed the VSU again with the other instructions and participated in an assessment center (AC). The two scores were moderately correlated; the correlations decreased when the instructions were taken into account. Neither expert nor target scores showed convergent validity with the AC variables, and none of the scoring‐instruction combinations showed significant associations with the remaining measures.