Developing methods for systematic reviewing in health services delivery and organization: an example from a review of access to health care for people with learning disabilities. Part 2. Evaluation of the literature—a practical guide
Author(s) -
Alison Alborz,
Rosalind McNally
Publication year - 2004
Publication title -
Health Information and Libraries Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.779
H-Index - 38
eISSN - 1471-1842
pISSN - 1471-1834
DOI - 10.1111/j.1471-1842.2004.00543.x
Subject(s) - rigour , computer science , toolbox , data extraction , quality (philosophy) , grey literature , inclusion (mineral) , health care , service delivery framework , knowledge management , set (abstract data type) , service (business) , medical education , management science , data science , psychology , medline , medicine , political science , law , social psychology , philosophy , geometry , mathematics , economy , epistemology , economics , programming language , economic growth
Objectives: To develop methods to facilitate the ‘systematic’ review of evidence from a range of methodologies on diffuse or ‘soft’ topics, as exemplified by ‘access to health care’.

Data sources: Twenty‐eight bibliographic databases, research registers, organizational websites or library catalogues; reference lists from identified studies; contact with experts and service users; current awareness and contents‐alerting services in the area of learning disabilities.

Review methods: Inclusion criteria were English‐language literature from 1980 onwards, relating to people with learning disabilities of any age, and all study designs. The main criterion for assessment was relevance to Gulliford’s model of access to health care, adapted to the circumstances of people with learning disabilities. Selected studies were evaluated for scientific rigour, then data were extracted and the results synthesized. Quality assessment began with an initial set of ‘generic’ quality indicators, which enabled further evidence selection before findings were evaluated against specific criteria for qualitative, quantitative or mixed‐method studies.

Results: Eighty‐two studies were fully evaluated. Five were rated ‘highly rigorous’, 22 ‘rigorous’ and 46 ‘less rigorous’; nine ‘poor’ papers were retained as the sole evidence covering aspects of the guiding model. The majority of studies were quantitative but used only descriptive statistics. Most evidence lacked methodological detail, which often lowered final quality ratings.

Conclusions: Applying a consistent structure to quality evaluation can facilitate data appraisal, extraction and synthesis across a range of methodologies in diffuse or ‘soft’ topics. Synthesis can be facilitated further by using software, such as a Microsoft Access database, for managing information.
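The review's consistent quality-evaluation structure (each study assigned one of four rigour ratings, then tallied for synthesis) can be managed with very simple tooling. The sketch below is illustrative only: the record layout and function are assumptions, not the authors' actual Microsoft Access schema, but it shows how per-study ratings might be recorded, validated against a fixed rating scale, and counted to reproduce totals like those reported in the Results.

```python
from collections import Counter

# Rating scale used in the review, ordered from strongest to weakest.
RATINGS = ["highly rigorous", "rigorous", "less rigorous", "poor"]

def tally_ratings(studies):
    """Count studies per quality rating.

    `studies` is a list of (study_id, rating) pairs; rating labels are
    validated against RATINGS so typos cannot silently skew the tally.
    Returns a dict keyed in the fixed RATINGS order.
    """
    counts = Counter()
    for study_id, rating in studies:
        if rating not in RATINGS:
            raise ValueError(f"unknown rating for {study_id}: {rating!r}")
        counts[rating] += 1
    return {r: counts[r] for r in RATINGS}

# Hypothetical records sized to match the review's reported results:
# 5 + 22 + 46 + 9 = 82 fully evaluated studies.
example = (
    [(f"S{i:02d}", "highly rigorous") for i in range(5)]
    + [(f"S{i:02d}", "rigorous") for i in range(5, 27)]
    + [(f"S{i:02d}", "less rigorous") for i in range(27, 73)]
    + [(f"S{i:02d}", "poor") for i in range(73, 82)]
)
summary = tally_ratings(example)
print(summary)  # {'highly rigorous': 5, 'rigorous': 22, 'less rigorous': 46, 'poor': 9}
```

A spreadsheet or any relational database would serve equally well; the point, as the authors conclude, is that a consistent structure makes appraisal, extraction and synthesis mechanical rather than ad hoc.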
