A Practitioner's Guide to Computation and Interpretation of Reliability Indices for Mastery Tests
Author(s) - Michael J. Subkoviak
Publication year - 1988
Publication title - Journal of Educational Measurement
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.917
H-Index - 47
eISSN - 1745-3984
pISSN - 0022-0655
DOI - 10.1111/j.1745-3984.1988.tb00290.x
Subject(s) - reliability , kappa , cohen's kappa , computation , statistics , mathematics education
From the perspective of teachers and test makers at the district or state level, current methods for obtaining reliability indices for mastery tests, such as the agreement coefficient and the kappa coefficient, are quite laborious. For example, some methods require two test administrations, whereas single-administration approaches involve complex statistical procedures and require access to appropriate computer software. The present paper offers practitioners tables from which agreement and kappa coefficients can be read directly. Furthermore, because these indices differ from traditional reliability coefficients, the issue of what constitutes acceptable values of agreement and kappa coefficients is also addressed.
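The two indices named in the abstract can be illustrated with a minimal sketch. Assuming two administrations of the same mastery test yield binary master/nonmaster classifications for each examinee, the agreement coefficient is the proportion classified consistently, and kappa corrects that proportion for chance agreement derived from the marginals. The function name and the example data below are illustrative, not taken from the article.

```python
def mastery_indices(first, second):
    """Compute (agreement coefficient p0, kappa) from two lists of
    0/1 mastery classifications (1 = master) for the same examinees
    on two administrations of a mastery test."""
    n = len(first)
    # p0: proportion of examinees classified the same way both times
    p0 = sum(a == b for a, b in zip(first, second)) / n
    # Marginal proportions classified as "master" on each administration
    p1 = sum(first) / n
    p2 = sum(second) / n
    # pe: agreement expected by chance, from the marginal proportions
    pe = p1 * p2 + (1 - p1) * (1 - p2)
    # Kappa: agreement in excess of chance, rescaled to a 0-1 ceiling
    kappa = (p0 - pe) / (1 - pe)
    return p0, kappa

# Illustrative data: 10 examinees, 8 classified consistently
first  = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
second = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
p0, kappa = mastery_indices(first, second)
```

With these made-up data, p0 = 0.8 while kappa is noticeably lower, which is the contrast the paper's discussion of "acceptable values" turns on: kappa discounts the agreement that the marginal mastery rates would produce by chance alone.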