Unified Parkinson's disease rating scale motor examination: Are ratings of nurses, residents in neurology, and movement disorders specialists interchangeable?
Author(s) - Post Bart, Merkus Maruschka P., de Bie Rob M.A., de Haan Rob J., Speelman Johannes D.
Publication year - 2005
Publication title - Movement Disorders
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.352
H-Index - 198
eISSN - 1531-8257
pISSN - 0885-3185
DOI - 10.1002/mds.20640
Subject(s) - neurology , rating scale , movement disorders , physical medicine and rehabilitation , parkinson's disease , medicine , motor symptoms , physical therapy , psychology , disease , psychiatry , pathology , developmental psychology
The Unified Parkinson's Disease Rating Scale (UPDRS) is widely used for the clinical evaluation of Parkinson's disease (PD). We assessed the rater variability of the UPDRS Motor examination (UPDRS‐ME) of nurse practitioners, residents in neurology, and a movement disorders specialist (MDS) compared to a senior MDS. We assessed the videotaped UPDRS‐ME of 50 PD patients. Inter‐rater and intra‐rater variability were estimated using weighted kappa (κw) and intraclass correlation coefficients (ICC). Additionally, inter‐rater agreement was quantified by calculating the mean difference between two raters and its 95% limits of agreement. Intra‐rater agreement was estimated by calculating 95% repeatability limits. The κw and ICC statistics indicated good to very good inter‐rater and intra‐rater reliability for the majority of individual UPDRS items and for the UPDRS‐ME sum score in all raters. For inter‐rater agreement, however, the nurses, the residents, and the MDS consistently assigned higher scores than the senior MDS. Mean differences ranged from 1.7 to 5.4 (all differences P < 0.05), with rather wide 95% limits of agreement. The intra‐rater 95% repeatability limits were also rather wide. We found considerable rater differences across the whole range of UPDRS‐ME scores between the senior MDS and the nurse practitioners, residents in neurology, and the MDS. This finding suggests that the amount by which raters may disagree should be quantified before starting longitudinal studies of disease progression or clinical trials. Finally, evaluation of rater agreement should always include assessment of the extent of bias between different raters. © 2005 Movement Disorder Society
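For readers unfamiliar with the agreement statistics named in the abstract, the following Python sketch illustrates how a weighted kappa (for ordinal item scores) and Bland-Altman mean difference with 95% limits of agreement (for sum scores) are typically computed. It is not taken from the paper; the array names and values are purely hypothetical, and the weighted kappa comes from scikit-learn's cohen_kappa_score.

import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings (0-4) of one UPDRS item by two raters for ten patients.
item_a = np.array([0, 1, 2, 2, 3, 1, 0, 4, 2, 3])
item_b = np.array([1, 1, 2, 3, 3, 2, 0, 4, 3, 3])
kappa_w = cohen_kappa_score(item_a, item_b, weights="quadratic")  # weighted kappa

# Hypothetical UPDRS-ME sum scores for the same patients by the two raters.
sum_a = np.array([22, 35, 18, 41, 27, 30, 16, 45, 29, 33])
sum_b = np.array([25, 39, 20, 44, 31, 33, 18, 49, 33, 36])

# Inter-rater agreement: mean difference and its Bland-Altman 95% limits of agreement.
diff = sum_b - sum_a
mean_diff = diff.mean()
sd_diff = diff.std(ddof=1)
loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

print(f"weighted kappa = {kappa_w:.2f}")
print(f"mean difference = {mean_diff:.1f}, 95% limits of agreement = {loa_low:.1f} to {loa_high:.1f}")

A systematic positive mean difference with narrow limits indicates bias between raters even when kappa or ICC values look high, which is the distinction the abstract draws between reliability and agreement.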