Empirical Limits for Time Series Econometric Models
Author(s) - Werner Ploberger, Peter C. B. Phillips
Publication year - 2003
Publication title - Econometrica
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 16.7
H-Index - 199
eISSN - 1468-0262
pISSN - 0012-9682
DOI - 10.1111/1468-0262.00419
Subject(s) - mathematics, econometrics, statistics, time series, mathematical analysis, logarithm, upper and lower bounds, cointegration, Akaike information criterion, curse of dimensionality, Fisher information, computer science
This paper characterizes empirically achievable limits for time series econometric modeling and forecasting. The approach involves the concept of minimal information loss in time series regression, and the paper shows how to derive bounds that delimit the proximity of empirical measures to the true probability measure (the DGP) in models of econometric interest. The approach uses joint probability measures over the combined space of parameters and observables, and the results apply to models with stationary, integrated, and cointegrated data. A theorem due to Rissanen is extended so that it applies directly to probabilities about the relative likelihood (rather than to averages), a new way of proving results of the Rissanen type is demonstrated, and the Rissanen theory is extended to nonstationary time series with unit roots, near unit roots, and cointegration of unknown order. The corresponding bound for the minimal information loss in empirical work is shown not to be a constant, in general, but to be proportional to the logarithm of the determinant of the (possibly stochastic) Fisher information matrix. In fact, the bound that determines proximity to the DGP is generally path dependent, and it depends on the type as well as the number of regressors. For practical purposes, the proximity bound has the asymptotic form (K/2) log n, where K is a new dimensionality factor that depends on the nature of the data as well as the number of parameters in the model. When 'good' model selection principles are employed in modeling time series data, the proximity bound is shown to quantify empirical limits even when the models may be incorrectly specified. One of the main implications of the new result is that time trends are more costly than stochastic trends, which are in turn more costly than stationary regressors in achieving proximity to the true density. Thus, in a very real and quantifiable sense, the DGP is more elusive when there is nonstationarity in the data. The implications for prediction are explored, and a second proximity theorem is given that bounds how close feasible predictors can come to the optimal predictor. Again, the bound has the asymptotic form (K/2) log n, showing that forecasting trends is fundamentally more difficult than forecasting stationary time series, even when the correct form of the model for the trends is known.
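As a numerical illustration of the (K/2) log n form, the sketch below (not taken from the paper; the Gaussian regression setting, the function name half_logdet_information, and the simulated single-regressor designs are assumptions made here) computes (1/2) log det(X'X), the log-determinant term of the sample information matrix, for a stationary regressor, a stochastic trend (random walk), and a deterministic time trend. The ratios to (1/2) log n come out close to 1, 2, and 3, reflecting the fact that the information matrix grows at rates of order n, n^2, and n^3 for these regressor types, which is the sense in which trending regressors receive a larger weight in the dimensionality factor K.

```python
import numpy as np

rng = np.random.default_rng(0)

def half_logdet_information(X, sigma2=1.0):
    # (1/2) log det(X'X / sigma^2): the log-determinant term of the
    # sample information matrix in a Gaussian linear regression with
    # design matrix X (an illustrative stand-in for the proximity bound).
    sign, logdet = np.linalg.slogdet(X.T @ X / sigma2)
    return 0.5 * logdet

n = 100_000
designs = {
    "stationary regressor": rng.standard_normal(n),             # X'X = O_p(n)
    "stochastic trend":     np.cumsum(rng.standard_normal(n)),  # X'X = O_p(n^2)
    "deterministic trend":  np.arange(1.0, n + 1),               # X'X ~ n^3 / 3
}

for label, x in designs.items():
    b = half_logdet_information(x.reshape(-1, 1))
    print(f"{label:22s} (1/2) log det = {b:7.2f}   "
          f"ratio to (1/2) log n = {b / (0.5 * np.log(n)):.2f}")
```

With a sample this large the printed ratios are roughly 1, 2, and 3 for the three designs, mirroring the ordering in the abstract: time trends are more costly than stochastic trends, which are more costly than stationary regressors, in achieving proximity to the true density.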