Bayesian Inference on Order‐Constrained Parameters in Generalized Linear Models
Author(s) - David B. Dunson, Brian Neelon
Publication year - 2003
Publication title - Biometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.298
H-Index - 130
eISSN - 1541-0420
pISSN - 0006-341X
DOI - 10.1111/1541-0420.00035
Subject(s) - gibbs sampling , mathematics , categorical variable , inference , covariate , generalized linear model , bayesian inference , computer science , bayes' theorem , monotonic function , bayesian probability , statistics , mathematical optimization , artificial intelligence , mathematical analysis
Summary - In biomedical studies, there is often interest in assessing the association between one or more ordered categorical predictors and an outcome variable, adjusting for covariates. For a k-level predictor, one typically uses either a (k − 1)-degree-of-freedom (df) test or a single-df trend test, which requires scores for the different levels of the predictor. In the absence of knowledge of a parametric form for the response function, one can incorporate monotonicity constraints to improve the efficiency of tests of association. This article proposes a general Bayesian approach for inference on order-constrained parameters in generalized linear models. Instead of choosing a prior distribution with support on the constrained space, which can result in major computational difficulties, we propose to map draws from an unconstrained posterior density using an isotonic regression transformation. This approach allows flat regions over which increases in the level of a predictor have no effect. Bayes factors for assessing ordered trends can be computed based on the output from a Gibbs sampling algorithm. Results from a simulation study are presented and the approach is applied to data from a time-to-pregnancy study.
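The core idea in the abstract, mapping unconstrained posterior draws onto the order-constrained space via isotonic regression, can be sketched in a few lines. The Python snippet below is an illustrative outline only, not the authors' implementation: the pava function, the mock Gibbs output in draws, and the category effects are all hypothetical. Each unconstrained draw of the k category effects is projected onto the non-decreasing cone with the pool-adjacent-violators algorithm, and constrained posterior summaries are then computed from the transformed draws.

```python
import numpy as np

def pava(y, weights=None):
    """Pool-adjacent-violators algorithm: weighted least-squares
    non-decreasing (isotonic) fit to the sequence y."""
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if weights is None else np.asarray(weights, dtype=float)
    values, wts, sizes = [], [], []          # current blocks of pooled entries
    for yi, wi in zip(y, w):
        values.append(yi); wts.append(wi); sizes.append(1)
        # merge the last two blocks while they violate monotonicity
        while len(values) > 1 and values[-2] > values[-1]:
            pooled = (wts[-2] * values[-2] + wts[-1] * values[-1]) / (wts[-2] + wts[-1])
            wts[-2] += wts[-1]; sizes[-2] += sizes[-1]; values[-2] = pooled
            values.pop(); wts.pop(); sizes.pop()
    return np.repeat(values, sizes)

# Hypothetical example: project each unconstrained posterior draw of the
# k ordered-category effects onto the non-decreasing cone, then summarize.
rng = np.random.default_rng(0)
k = 4
draws = rng.normal(loc=[0.0, 0.1, 0.05, 0.3], scale=0.2, size=(5000, k))  # stand-in for Gibbs output
constrained = np.array([pava(d) for d in draws])
print(constrained.mean(axis=0))  # posterior means under the order constraint
```

In a projection of this kind, flat regions arise naturally: whenever adjacent violators are pooled, the corresponding categories receive a common value in that draw, which is consistent with the abstract's allowance for levels of the predictor that have no additional effect.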
