Open Access
S24 – Compatibility of AGREE and clinical experts review in guideline appraisal
Author(s) -
Kuo Ken N.,
Lo HengLien,
Chen Chiehfeng
Publication year - 2010
Publication title -
Otolaryngology–Head and Neck Surgery
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.232
H-Index - 121
eISSN - 1097-6817
pISSN - 0194-5998
DOI - 10.1016/j.otohns.2010.04.146
Subject(s) - library science, guideline, citation, family medicine, medicine, political science, computer science, law
Abstract

BACKGROUND (INTRODUCTION): AGREE is the most widely accepted instrument for appraising the methodological quality of clinical practice guidelines (CPGs). Its six domains measure different aspects of CPG quality and may differ from the clinical expert perspective.

LEARNING OBJECTIVES (TRAINING GOALS): 1. To compare the results and compatibility of CPG appraisal between AGREE measures and the clinical expert perspective. 2. To identify inconsistent criteria in order to improve consensus between AGREE reviewers and clinical specialists.

METHODS: We collected data from independent evaluations, by AGREE reviewers and by related clinical experts, of 17 CPGs developed from 2007 to 2008. A "strongly recommended" rating was scored 3, "recommended with alteration" 2, and "not recommended" 1. Differences between AGREE and clinical expert scores were expressed as sensitivity, specificity, and positive and negative predictive value within and across AGREE domains.

RESULTS: Nine of the 17 CPGs received similar recommendations from AGREE and clinical expert ratings. Four AGREE domains were particularly sensitive to the clinical expert perspective: stakeholder involvement (sensitivity 0.89, specificity 0.75, PPV 0.80, NPV 0.86), rigor of development (0.89, 1.0, 1.0, 0.89), clarity and presentation (0.78, 0.88, 0.88, 0.78), and editorial independence (0.78, 1.0, 1.0, 0.80). The result was unchanged when we scored only those four sensitive domains and omitted the other two. Within each domain, most items under "rigor of development," "clarity and presentation," and "editorial independence" showed relatively high coherence, whereas consistency varied within the "stakeholder involvement" domain.

DISCUSSION (CONCLUSION): Our findings indicate compatibility between AGREE and clinical expert appraisal of CPG quality, and mutual predictability between the two. Improving reviewer training is crucial for the AGREE domains that showed inconsistency.
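The sensitivity, specificity, PPV, and NPV figures reported above follow from standard confusion-matrix arithmetic once the paired ratings are dichotomized (e.g. "recommended" vs. "not recommended", with the clinical expert verdict taken as the reference standard). The following is a minimal sketch of that calculation; the function name and the toy verdict lists are illustrative assumptions, not the authors' data or software.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Compute sensitivity, specificity, PPV, and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all reference positives
        "specificity": tn / (tn + fp),  # true negatives among all reference negatives
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical example: dichotomized verdicts for one AGREE domain across five
# guidelines, with the clinical expert verdict as the reference standard.
agree_positive  = [True, True, False, True, False]   # AGREE domain verdicts
expert_positive = [True, True, False, False, False]  # expert verdicts (reference)

tp = sum(a and e for a, e in zip(agree_positive, expert_positive))
fp = sum(a and not e for a, e in zip(agree_positive, expert_positive))
fn = sum((not a) and e for a, e in zip(agree_positive, expert_positive))
tn = sum((not a) and (not e) for a, e in zip(agree_positive, expert_positive))

metrics = diagnostic_metrics(tp, fp, fn, tn)
```

With the toy verdicts above (tp=2, fp=1, fn=0, tn=2) this yields sensitivity 1.0, specificity 0.67, PPV 0.67, and NPV 1.0, illustrating how a single domain's agreement pattern maps to the four reported metrics.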
