Evaluation and inter‐observer analysis of retinography existing clinical classification system to categorize moderate retinopathies
Author(s) -
SANCHEZ RAMOS C,
WASFI M,
BONNIN ARIAS C,
VINAS PENA M,
FORLAN A,
MOLINA GOMEZ M,
CHAMORRO E
Publication year - 2009
Publication title -
Acta Ophthalmologica
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.534
H-Index - 87
eISSN - 1755-3768
pISSN - 1755-375X
DOI - 10.1111/j.1755-3768.2009.422.x
Subject(s) - kappa, concordance, categorization, drusen, Cohen's kappa, medicine, repeatability, grading (engineering), inter-rater reliability, statistics, reliability (semiconductor), optometry, mathematics, artificial intelligence, ophthalmology, computer science, retinal, power (physics), civil engineering, geometry, rating scale, physics, quantum mechanics, engineering
Purpose: To evaluate existing validated systems for classifying retinopathies and to determine the level of inter‐observer agreement.

Methods: 55 retinographies (both genders, >60 years old) were categorized following 3 established criteria: the International Classification and Grading System for ARM and AMD (BIRD), the Wisconsin Age‐related Maculopathy Grading System (WISCONSIN) and the Clinical Age‐Related Maculopathy Staging System (CARMS). The categorization was performed by 2 experts in a blind, independent manner. The goal of the double‐classification method, with a non‐randomised changing order, was to rule out the influence of the Velo effect. Inter‐observer repeatability was checked as an agreement parameter between the two experts.

Results: The CARMS system was chosen, since it allows a global classification of the pathology. CARMS concordance was 81.89% (expected = 57.39%, Kappa index = 0.57). Drusen concordance was 87.27% (expected = 60.31%, Kappa index = 0.68). Pigmentation concordance was 76.36% (expected = 54.46%, Kappa index = 0.48). Reliability measured by the Kappa index followed the rule: 0.80‐1.00 (Excellent), 0.60‐0.80 (Good), 0.40‐0.60 (Moderate), 0.20‐0.40 (Low), <0.20 (Bad).

Conclusion: Existing retinopathy classification systems can be improved to address the large degree of subjectivity they present. The Kappa indices obtained showed moderate concordance in the inter‐observer analysis, except for the Drusen variable, a dichotomous variable, which presented good reliability. The poorer behaviour of the Pigmentation variable relative to the Drusen variable should be noted: it has an important influence on the global CARMS reliability, lowering it.
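The Kappa values reported above follow from Cohen's kappa, which relates observed agreement to chance-expected agreement via kappa = (p_o − p_e) / (1 − p_e). A minimal sketch (not from the paper; the function names and the reliability bins are assumptions based on the scale quoted in the abstract) reproducing the reported indices from the percentages given:

```python
# Sketch: Cohen's kappa from observed and chance-expected agreement.
# kappa = (p_o - p_e) / (1 - p_e); proportions are given as 0-1 fractions.

def cohens_kappa(observed: float, expected: float) -> float:
    """Agreement beyond chance, normalised by the maximum possible beyond-chance agreement."""
    return (observed - expected) / (1.0 - expected)

def rate(kappa: float) -> str:
    """Map a kappa value to the reliability scale quoted in the abstract (bin edges assumed)."""
    if kappa >= 0.80:
        return "Excellent"
    if kappa >= 0.60:
        return "Good"
    if kappa >= 0.40:
        return "Moderate"
    if kappa >= 0.20:
        return "Low"
    return "Bad"

# Observed and expected agreement fractions reported in the abstract.
for name, p_o, p_e in [("CARMS", 0.8189, 0.5739),
                       ("Drusen", 0.8727, 0.6031),
                       ("Pigmentation", 0.7636, 0.5446)]:
    k = cohens_kappa(p_o, p_e)
    print(f"{name}: kappa = {k:.2f} ({rate(k)})")
# → CARMS: kappa = 0.57 (Moderate)
# → Drusen: kappa = 0.68 (Good)
# → Pigmentation: kappa = 0.48 (Moderate)
```

Note that the computed values match the reported Kappa indices (0.57, 0.68, 0.48) to two decimals, consistent with the moderate overall concordance and the good reliability of the Drusen variable.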
