The inter‐rater and test–retest reliability of the Home Falls and Accidents Screening Tool
Author(s) -
Vu, TuongVi;
Mackenzie, Lynette
Publication year - 2012
Publication title -
Australian Occupational Therapy Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.595
H-Index - 44
eISSN - 1440-1630
pISSN - 0045-0766
DOI - 10.1111/j.1440-1630.2012.01012.x
Subject(s) - inter-rater reliability , test–retest reliability , Cohen's kappa , rating scale , falls prevention , injury prevention , occupational safety and health , environmental health , physical therapy , gerontology , medicine , psychology , statistics
Background/aim: The Home Falls and Accidents Screening Tool was developed to assist health professionals in identifying falls risk among community‐dwelling older people arising from their home environment. The aim of this study was to evaluate the Home Falls and Accidents Screening Tool by examining its inter‐rater and test–retest reliability.

Methods: Community‐dwelling older people over the age of 65 (n = 31) were recruited from the caseload of nine occupational therapists across three area health services in Sydney and the Hunter region. A total of 31 home visits were conducted by the researcher and an occupational therapist, who independently administered the Home Falls and Accidents Screening Tool. Follow‐up visits were then conducted within a two‐week period by one of the raters to re‐administer the tool. Reliability was evaluated using percentage agreement, intra‐class correlations and kappa coefficients.

Results: The intra‐class correlation coefficient for the Home Falls and Accidents Screening Tool overall score was 0.82 (95% CI, 0.66–0.91) for inter‐rater reliability and 0.77 (95% CI, 0.57–0.88) for test–retest reliability, indicating a good level of reliability for the tool. 'Undefined stair edges' was the only item that demonstrated poor inter‐rater reliability (kappa = 0.37). All items except 'loose mats' (kappa = 0.19) reached acceptable or excellent levels of test–retest reliability, with kappa scores greater than 0.40.

Conclusion: The Home Falls and Accidents Screening Tool demonstrated consistency across raters and across different time periods. Further studies into the reliability of the Home Falls and Accidents Screening Tool would benefit from sampling raters from varying professional backgrounds and older people with higher levels of function.
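The item-level statistics reported above (percentage agreement and Cohen's kappa) can be sketched as follows. This is a minimal illustration of the two measures for a single dichotomous item rated by two raters; the rating vectors are invented for demonstration and are not data from the study.

```python
# Sketch: percentage agreement and Cohen's kappa for two raters' binary
# item ratings (hazard present = 1 / absent = 0). The example data below
# are hypothetical, not taken from the study.

def percent_agreement(a, b):
    """Proportion of homes on which both raters give the same rating."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_o = percent_agreement(a, b)
    # Expected chance agreement from each rater's marginal proportions
    p1, p2 = sum(a) / n, sum(b) / n
    p_e = p1 * p2 + (1 - p1) * (1 - p2)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of one item across ten homes
rater_1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
rater_2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

print(round(percent_agreement(rater_1, rater_2), 2))  # 0.9
print(round(cohens_kappa(rater_1, rater_2), 2))       # 0.8
```

Kappa is lower than raw agreement because it discounts the agreement the two raters would reach by chance alone, which is why the study interprets kappa thresholds (e.g. 0.40) rather than raw percentages.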