Employment Interview Reliability: New meta-analytic estimates by structure and format
Author(s) - Huffcutt Allen I., Culbertson Satoris S., Weyhrauch William S.
Publication year - 2013
Publication title - International Journal of Selection and Assessment
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.812
H-Index - 61
eISSN - 1468-2389
pISSN - 0965-075X
DOI - 10.1111/ijsa.12036
Subject(s) - inter-rater reliability , psychology , meta-analysis , applied psychology , interview , social psychology , statistics , rating scale
This study sought to provide an update on evidence regarding the interrater reliability of employment interviews. Using a final dataset of 125 coefficients with a total sample size of 32,428, our results highlight the importance of taking all three sources of measurement error (random response, transient, and conspect) into account. For instance, the mean interrater reliability was considerably higher for panel interviews than for separate interviews conducted by different interviewers (.74 vs. .44). A strong implication of our findings is that interview professionals should not base perceptions of the psychometric properties of their interview process on interrater estimates that do not include all three sources. A number of directions for future research were identified, including the influence of cues in medium-structure panel interviews (e.g., changes in tone or pitch) and the lower-than-expected reliability of highly structured interviews conducted separately by different interviewers.
