Development of workplace-based assessments of non-technical skills in anaesthesia
Author(s) -
Crossingham G. V.,
Sice P. J. A.,
Roberts M. J.,
Lam W. H.,
Gale T. C. E.
Publication year - 2012
Publication title -
Anaesthesia
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.839
H-Index - 117
eISSN - 1365-2044
pISSN - 0003-2409
DOI - 10.1111/j.1365-2044.2011.06977.x
Subject(s) - medicine , kappa , inter-rater reliability , Cohen's kappa , specialty , selection , medical education , statistics , rating scale
Summary Non-technical skills are recognised as crucial to good anaesthetic practice. We designed and evaluated a specialty-specific tool to assess non-technical aspects of trainee performance in theatre, based on a system previously found reliable in a recruitment setting. We compared inter-rater agreement (multi-rater kappa) for live assessments in theatre with that in a selection centre and a video-based rater training exercise. Twenty-seven trainees participated in the first in-theatre assessment round and 40 in the second. Round 1 scores had poor inter-rater agreement (mean kappa = 0.20) and low reliability (generalisability coefficient G = 0.50). A subsequent assessor training exercise showed good inter-rater agreement (mean kappa = 0.79) but did not improve performance of the assessment tool when used in round 2 (mean kappa = 0.14, G = 0.42). Inter-rater agreement in two selection centres (mean kappa = 0.61 and 0.69) exceeded that found in theatre. Assessment tools that perform reliably in controlled settings may not do so in the workplace.
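The study's headline statistic is a multi-rater kappa. The abstract does not state which variant was used, but Fleiss' kappa is one common multi-rater generalisation of Cohen's kappa; a minimal sketch of its computation, under that assumption, is:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a subjects-by-categories count matrix.

    ratings[i][j] = number of raters who assigned subject i to category j.
    Every row must sum to the same number of raters n.
    """
    N = len(ratings)        # number of subjects assessed
    n = sum(ratings[0])     # raters per subject
    k = len(ratings[0])     # number of rating categories

    # Mean observed agreement across subjects:
    # P_i = (sum_j n_ij^2 - n) / (n * (n - 1))
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings
    ) / N

    # Chance agreement from the marginal category proportions
    p = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    Pe_bar = sum(pj * pj for pj in p)

    # Kappa: agreement beyond chance, scaled by the maximum possible
    return (P_bar - Pe_bar) / (1 - Pe_bar)


# Hypothetical example: 3 raters scoring 3 trainees into 2 categories.
# Perfect agreement on every trainee yields kappa = 1.0.
print(fleiss_kappa([[3, 0], [0, 3], [3, 0]]))  # → 1.0
```

Values near 1 indicate strong agreement beyond chance; the in-theatre means reported here (0.20 and 0.14) are conventionally read as slight agreement, whereas the selection-centre values (0.61, 0.69) fall in the substantial range.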
