Bias modelling in evidence synthesis
Author(s) - Turner Rebecca M., Spiegelhalter David J., Smith Gordon C. S., Thompson Simon G.
Publication year - 2009
Publication title - Journal of the Royal Statistical Society: Series A (Statistics in Society)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.103
H-Index - 84
eISSN - 1467-985X
pISSN - 0964-1998
DOI - 10.1111/j.1467-985X.2008.00547.x
Subject(s) - rigour , context (archaeology) , relevance (law) , publication bias , econometrics , excellence , variance (accounting) , psychology , computer science , statistics , cognitive psychology , meta analysis , medicine , mathematics , political science , economics , accounting , paleontology , geometry , law , biology
Summary. Policy decisions often require synthesis of evidence from multiple sources, and the source studies typically vary in rigour and in relevance to the target question. We present simple methods of allowing for differences in rigour (or lack of internal bias) and relevance (or lack of external bias) in evidence synthesis. The methods are developed in the context of reanalysing a UK National Institute for Clinical Excellence technology appraisal in antenatal care, which includes eight comparative studies. Many were historically controlled, only one was a randomized trial, and doses, populations and outcomes varied between studies and differed from the target UK setting. Using elicited opinion, we construct prior distributions to represent the biases in each study and perform a bias‐adjusted meta‐analysis. Adjustment shifted the combined estimate away from the null by approximately 10% and almost tripled its variance. Our generic bias modelling approach allows decisions to be based on all available evidence, with less rigorous or less relevant studies downweighted by computationally simple methods.
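As a rough illustration of the kind of computationally simple, bias-adjusted pooling the summary describes, the sketch below adjusts each study's estimate by an elicited bias mean, inflates its variance by the elicited bias variance, and then combines the adjusted estimates by inverse-variance weighting. All study names, values, and scales here are hypothetical and chosen only for illustration; they are not taken from the paper, which should be consulted for the actual elicitation and modelling details.

```python
import numpy as np

# Hypothetical study-level inputs (illustrative values only, not from the paper):
# y        : observed effect estimates (e.g. log odds ratios)
# var_y    : within-study variances of those estimates
# bias_mu  : elicited mean of the additive (internal plus external) bias per study
# bias_var : elicited variance of that bias, on the same scale
y        = np.array([-0.50, -0.30, -0.80, -0.20])
var_y    = np.array([0.04, 0.09, 0.12, 0.06])
bias_mu  = np.array([-0.10, -0.05, -0.20, 0.00])
bias_var = np.array([0.05, 0.02, 0.10, 0.01])

# Bias adjustment: subtract the expected bias from each estimate and
# inflate its variance by the uncertainty about that bias.
y_adj   = y - bias_mu
var_adj = var_y + bias_var

# Fixed-effect inverse-variance pooling of the adjusted estimates.
# Studies with large or uncertain elicited biases receive little weight,
# which is how less rigorous or less relevant studies get downweighted.
w          = 1.0 / var_adj
pooled     = np.sum(w * y_adj) / np.sum(w)
pooled_var = 1.0 / np.sum(w)

print(f"Bias-adjusted pooled estimate: {pooled:.3f} (SE {np.sqrt(pooled_var):.3f})")
```

Comparing this output with an unadjusted pooling of `y` and `var_y` reproduces the qualitative behaviour noted in the summary: the adjusted point estimate shifts, and its variance increases because the elicited bias uncertainty is added to each study's variance.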
