Estimation of stationary autoregressive models with the Bayesian LASSO
Author(s) - Schmidt Daniel F., Makalic Enes
Publication year - 2013
Publication title - Journal of Time Series Analysis
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.576
H-Index - 54
eISSN - 1467-9892
pISSN - 0143-9782
DOI - 10.1111/jtsa.12027
Subject(s) - autoregressive model , lasso (statistics) , mathematics , gibbs sampling , bayesian probability , model selection , bayes factor , prior probability , posterior probability , bayesian information criterion , marginal likelihood , bayes' theorem , statistics , autoregressive integrated moving average , computer science , time series
This article explores the problem of estimating stationary autoregressive models from observed data using the Bayesian least absolute shrinkage and selection operator (LASSO). By characterizing the model in terms of partial autocorrelations, rather than coefficients, it becomes straightforward to guarantee that the estimated models are stationary. The form of the negative log‐likelihood is exploited to derive simple expressions for the conditional likelihood functions, leading to efficient algorithms for computing the posterior mode by coordinate‐wise descent and exploring the posterior distribution by Gibbs sampling. Both empirical Bayes and Bayesian methods are proposed for the estimation of the LASSO hyper‐parameter from the data. Simulations demonstrate that the Bayesian LASSO performs well in terms of prediction when compared with a standard autoregressive order selection method.
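As a concrete illustration of the reparameterization described in the abstract, the sketch below (not the authors' code) maps a vector of partial autocorrelations, each constrained to the interval (-1, 1), to autoregressive coefficients using the standard Levinson-Durbin-type recursion; any such vector corresponds to a stationary AR model, which is the property the article exploits. The function name pacf_to_ar and the example values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def pacf_to_ar(pacf):
    """Map partial autocorrelations (each in (-1, 1)) to AR coefficients.

    Uses the Levinson-Durbin-type recursion: treating each partial
    autocorrelation as a reflection coefficient guarantees that the
    resulting AR(p) model is stationary.
    """
    phi = np.array([], dtype=float)
    for r in pacf:
        prev = phi
        # phi_j^(k) = phi_j^(k-1) - r_k * phi_{k-j}^(k-1), then phi_k^(k) = r_k
        phi = np.append(prev - r * prev[::-1], r)
    return phi

# Example (hypothetical values): three partial autocorrelations in (-1, 1)
coeffs = pacf_to_ar([0.5, -0.3, 0.2])
print(coeffs)  # AR(3) coefficients

# Stationarity check: all roots of 1 - phi_1 z - ... - phi_p z^p lie outside the unit circle
poly = np.concatenate(([1.0], -coeffs))[::-1]  # highest-degree coefficient first
print(np.all(np.abs(np.roots(poly)) > 1))      # True
```

In this parameterization the LASSO penalty and the Gibbs sampler operate on the partial autocorrelations directly, so every draw or coordinate-wise update remains inside the stationary region without any explicit constraint handling.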