Regression model selection—a residual likelihood approach
Author(s) - Shi, Peide; Tsai, Chih-Ling
Publication year - 2002
Publication title - Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/1467-9868.00335
Subject(s) - Akaike information criterion, Bayesian information criterion, statistics, mathematics, information criteria, deviance information criterion, model selection, autoregressive model, residual, regression analysis, regression, sample size determination, Bayesian probability, Bayesian inference, algorithm
Summary. We obtain the residual information criterion (RIC), a selection criterion based on the residual log-likelihood, for regression models including classical regression models, Box–Cox transformation models, weighted regression models and regression models with autoregressive moving average errors. We show that RIC is a consistent criterion. Simulation studies for each of the four models indicate that RIC provides better model order choices than the Akaike information criterion, the corrected Akaike information criterion, the final prediction error, C_p and the adjusted R², except when the sample size is small and the signal-to-noise ratio is weak; in that case, none of the criteria performs well. Monte Carlo results also show that RIC is superior to the consistent Bayesian information criterion (BIC) when the signal-to-noise ratio is not weak, and that it is comparable with BIC when the signal-to-noise ratio is weak and the sample size is large.
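The kind of comparison the summary describes can be sketched with the standard competing criteria it cites. The sketch below computes AIC and BIC (in their usual Gaussian-regression forms, up to an additive constant) for polynomial models of increasing order and selects the order that minimizes each; it does not implement the paper's RIC itself, which requires the residual (restricted) log-likelihood, and the simulated data, noise level and model orders are purely illustrative assumptions.

```python
import numpy as np

def fit_rss(X, y):
    """Least-squares fit; return the residual sum of squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def aic(rss, n, p):
    # Gaussian-likelihood AIC up to an additive constant.
    return n * np.log(rss / n) + 2 * p

def bic(rss, n, p):
    # BIC replaces the 2p penalty with p*log(n), hence its consistency.
    return n * np.log(rss / n) + p * np.log(n)

# Illustrative data: true regression function is quadratic (order 2).
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.3, n)

scores = {}
for order in range(6):
    X = np.vander(x, order + 1, increasing=True)  # columns 1, x, ..., x^order
    rss = fit_rss(X, y)
    p = order + 1  # number of regression coefficients
    scores[order] = (aic(rss, n, p), bic(rss, n, p))

best_aic = min(scores, key=lambda k: scores[k][0])
best_bic = min(scores, key=lambda k: scores[k][1])
```

With a signal-to-noise ratio this strong, BIC's heavier `p*log(n)` penalty recovers the true order; AIC's fixed penalty of 2 per parameter leaves it prone to selecting an order at least as large, which is the overfitting tendency that consistent criteria such as BIC (and, per the paper, RIC) are designed to avoid.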
