
Bayesian variable selection in linear quantile mixed models for longitudinal data with application to macular degeneration
Author(s) -
Yonggang Ji,
Haifang Shi
Publication year - 2020
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0241197
Subject(s) - lasso (statistics) , bayesian probability , gibbs sampling , quantile regression , bayesian linear regression , prior probability , bayesian multivariate linear regression , quantile , cholesky decomposition , bayesian inference , markov chain monte carlo , covariate , linear model , feature selection , linear regression , mathematics , statistics
This paper presents a Bayesian analysis of linear mixed models for quantile regression based on a Cholesky decomposition of the covariance matrix of the random effects. We develop a Bayesian shrinkage approach to quantile mixed regression models using a Bayesian adaptive lasso and an extended Bayesian adaptive group lasso. We also consider variable selection procedures for both fixed and random effects in a linear quantile mixed model via the Bayesian adaptive lasso and the extended Bayesian adaptive group lasso with spike and slab priors. To improve mixing of the Markov chains, a simple and efficient partially collapsed Gibbs sampling algorithm is developed for posterior inference. Simulation studies and an application to the Age-Related Macular Degeneration Trial data are used to demonstrate the proposed methods.
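The paper's partially collapsed Gibbs sampler for the full mixed model is not reproduced here. As an illustration only, the following is a minimal Python sketch of a standard Gibbs sampler for Bayesian adaptive-lasso quantile regression with fixed effects, using the asymmetric-Laplace working likelihood written as a normal-exponential mixture; this is the basic building block that the paper extends with random effects, a Cholesky-factored covariance, group-lasso penalties, and spike and slab priors. All function and variable names (gibbs_bal_qr, tau, lam2, and so on) are illustrative and not taken from the paper, and the asymmetric-Laplace scale is fixed at 1 for brevity.

import numpy as np

def gibbs_bal_qr(y, X, tau=0.5, n_iter=2000, burn=500, a=1.0, b=0.1, seed=0):
    # Gibbs sampler for Bayesian adaptive-lasso quantile regression at level tau,
    # using the normal-exponential mixture form of the asymmetric Laplace likelihood
    # (ALD scale fixed to 1). Returns posterior draws of the fixed effects.
    rng = np.random.default_rng(seed)
    n, p = X.shape
    theta = (1 - 2 * tau) / (tau * (1 - tau))     # ALD mixture location constant
    kappa2 = 2.0 / (tau * (1 - tau))              # ALD mixture scale constant

    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # initialize at least squares
    v = np.ones(n)                                # latent exponential mixing variables
    s = np.ones(p)                                # local lasso scales for each beta_j
    lam2 = np.ones(p)                             # adaptive penalties lambda_j^2
    draws = np.empty((n_iter - burn, p))

    for it in range(n_iter):
        # beta | v, s, y : multivariate normal with lasso (normal-scale-mixture) prior
        d = 1.0 / (kappa2 * v)                    # per-observation precisions
        prec = X.T @ (X * d[:, None]) + np.diag(1.0 / s)
        cov = np.linalg.inv(prec)
        mean = cov @ (X.T @ (d * (y - theta * v)))
        beta = rng.multivariate_normal(mean, cov)

        # v_i | beta, y : 1/v_i is inverse Gaussian (GIG(1/2, .) conditional)
        resid = y - X @ beta
        a_i = np.maximum(resid**2 / kappa2, 1e-12)
        b_c = theta**2 / kappa2 + 2.0
        v = 1.0 / rng.wald(np.sqrt(b_c / a_i), b_c)

        # s_j | beta_j, lambda_j : 1/s_j is inverse Gaussian (Bayesian lasso step)
        mu_s = np.sqrt(lam2 / np.maximum(beta**2, 1e-12))
        s = 1.0 / rng.wald(mu_s, lam2)

        # lambda_j^2 | s_j : conjugate Gamma update for the adaptive penalties
        lam2 = rng.gamma(shape=a + 1.0, scale=1.0 / (b + 0.5 * s))

        if it >= burn:
            draws[it - burn] = beta
    return draws

# Illustrative usage on simulated data (not from the paper):
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, 0.0, -2.0, 0.0, 0.8]) + rng.standard_t(df=3, size=200)
draws = gibbs_bal_qr(y, X, tau=0.25)
print(draws.mean(axis=0))   # posterior means of the fixed effects at tau = 0.25

The lasso prior is handled through its normal-scale-mixture representation, so every full conditional is a standard distribution and the sampler needs no tuning; the paper's partially collapsed scheme further improves mixing by integrating some of these latent quantities out of selected conditionals.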