Bayesian Covariance Selection in Generalized Linear Mixed Models
Author(s) - Bo Cai, David B. Dunson
Publication year - 2006
Publication title - Biometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.298
H-Index - 130
eISSN - 1541-0420
pISSN - 0006-341X
DOI - 10.1111/j.1541-0420.2005.00499.x
Subject(s) - Cholesky decomposition, covariance, Gibbs sampling, generalized linear mixed model, mathematics, random effects model, Markov chain Monte Carlo, model selection, exponential family, Bayesian probability, statistics, eigenvalues and eigenvectors, medicine, physics, meta analysis, quantum mechanics
Summary - The generalized linear mixed model (GLMM), which extends the generalized linear model (GLM) to incorporate random effects characterizing heterogeneity among subjects, is widely used in analyzing correlated and longitudinal data. Although there is often interest in identifying the subset of predictors that have random effects, random effects selection can be challenging, particularly when outcome distributions are nonnormal. This article proposes a fully Bayesian approach to the problem of simultaneous selection of fixed and random effects in GLMMs. Integrating out the random effects induces a covariance structure on the multivariate outcome data, and an important problem that we also consider is that of covariance selection. Our approach relies on variable selection-type mixture priors for the components in a special Cholesky decomposition of the random effects covariance. A stochastic search MCMC algorithm is developed, which relies on Gibbs sampling, with Taylor series expansions used to approximate intractable integrals. Simulated data examples are presented for different exponential family distributions, and the approach is applied to discrete survival data from a time-to-pregnancy study.
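For orientation, the kind of Cholesky-based reparameterization and mixture prior described in the summary can be sketched as follows. This is a hedged illustration based only on the abstract; the notation (b_i, \Lambda, \Gamma, \xi_i, \lambda_l, \pi_l, and the hyperparameters \mu_l, \sigma_l^2) is assumed here for exposition and may differ from the article's own.

b_i = \Lambda \Gamma \xi_i, \qquad \xi_i \sim N(0, I_q),

where \Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_q) has nonnegative diagonal elements and \Gamma is lower triangular with ones on the diagonal, so the random effects covariance is \Sigma = \Lambda \Gamma \Gamma' \Lambda'. A variable selection-type mixture prior such as

\lambda_l \sim \pi_l \, \delta_0 + (1 - \pi_l) \, N^{+}(\mu_l, \sigma_l^2)

places positive prior probability on \lambda_l = 0, which zeroes out the l-th row and column of \Sigma and thus drops the l-th random effect from the model; sampling these indicators within a Gibbs scheme gives a stochastic search over random effects structures of the type the summary describes.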
