Open Access
Test-retest reliability and rater agreements of Assessment of Capacity for Myoelectric Control version 2.0
Author(s) -
Helen Lindner,
Ann Langius-Eklöf,
Liselotte Hermansson
Publication year - 2014
Publication title -
The Journal of Rehabilitation Research and Development
Language(s) - English
Resource type - Journals
eISSN - 1938-1352
pISSN - 0748-7711
DOI - 10.1682/jrrd.2013.09.0197
Subject(s) - intraclass correlation , inter-rater reliability , kappa , limits of agreement , standard error , physical therapy , Cohen's kappa , intra-rater reliability , psychology , reproducibility , medicine , physical medicine and rehabilitation , confidence interval , statistics , rating scale
The Assessment of Capacity for Myoelectric Control (ACMC) is an observation-based tool that evaluates the ability to control a myoelectric prosthetic hand. Validity evidence led to ACMC version 2.0, but the test-retest reliability and minimal detectable change (MDC) of the ACMC had never been evaluated. Investigation of rater agreement in this version was also needed because it contains new definitions in certain rating categories and items. Upper-limb prosthesis users (n = 25; 15 congenital, 10 acquired; mean age 27.5 yr) performed one standardized activity twice, 2 to 5 wk apart. Activity performances were video-recorded and assessed by two ACMC raters. Data were analyzed with weighted kappa, the intraclass correlation coefficient (ICC), and the Bland-Altman method. For test-retest reliability, weighted kappa agreements were fair to excellent (0.52 to 1.00), ICC(2,1) was 0.94, and one user fell outside the limits of agreement in the Bland-Altman plot. MDC95 was less than or equal to 0.55 logits (one rater) and 0.69 logits (two raters). For interrater reliability, weighted kappa agreements were fair to excellent in both sessions (0.44 to 1.00), and ICC(2,1) was 0.95 (test) and 0.92 (retest). Intrarater agreement (rater 1) was also excellent (ICC(3,1) = 0.98). Evidence regarding the reliability of the ACMC is satisfactory, and MDC95 can be used to indicate change.
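The MDC95 values reported above come from the standard relationship between the minimal detectable change, the standard error of measurement (SEM), and a test-retest reliability coefficient. A minimal sketch of that arithmetic is below; the between-subject SD used here is an assumed placeholder, not a value from the study, so the printed result is illustrative only.

```python
import math

def sem_from_icc(sd: float, icc: float) -> float:
    """Standard error of measurement from the between-subject SD and a
    test-retest reliability coefficient (e.g. ICC(2,1)):
    SEM = SD * sqrt(1 - ICC). Units follow the score (here, logits)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem: float) -> float:
    """Minimal detectable change at the 95% confidence level:
    MDC95 = 1.96 * sqrt(2) * SEM. The sqrt(2) reflects the variance of
    a difference between two measurements."""
    return 1.96 * math.sqrt(2.0) * sem

# icc is the test-retest ICC(2,1) reported in the abstract;
# sd is a hypothetical between-subject SD in logits (assumed).
icc = 0.94
sd = 1.6
print(round(mdc95(sem_from_icc(sd, icc)), 2))
```

With a different (true) sample SD, the same two lines of arithmetic reproduce the study's reported MDC95 thresholds.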
