Dimension reduction for the conditional mean and variance functions in time series
Author(s) -
Park JinHong,
Samadi S. Yaser
Publication year - 2020
Publication title -
Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/sjos.12405
Subject(s) - mathematics , conditional expectation , estimator , conditional variance , univariate , time series , nonparametric statistics , variance reduction , consistency (knowledge bases) , variance (accounting) , conditional probability distribution , dimension (graph theory) , variance function , econometrics , multivariate statistics , monte carlo method , autoregressive conditional heteroskedasticity , volatility (finance) , statistics
Abstract This paper deals with the nonparametric estimation of the mean and variance functions of univariate time series data. We propose a nonparametric dimension reduction technique for both the mean and variance functions of a time series. The method does not require any model specification; instead, we seek directions in both the mean and variance functions such that the conditional distribution of the current observation given the vector of past observations is the same as the conditional distribution of the current observation given a few linear combinations of the past observations, so that no inferential information is lost. The directions of the mean and variance functions are estimated by maximizing the Kullback–Leibler distance function. The consistency of the proposed estimators is established. A computational procedure is introduced to detect the lags of the conditional mean and variance functions in practice. Numerical examples and simulation studies are performed to illustrate and evaluate the performance of the proposed estimators.
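The core idea of the abstract, finding a single direction beta such that the conditional mean of y_t given its past depends only on beta'(y_{t-1}, ..., y_{t-p}), can be illustrated with a minimal sketch. This is not the authors' Kullback–Leibler estimator: as a stand-in objective it minimizes the residual sum of squares of a Nadaraya–Watson kernel regression of y_t on the candidate index, and all names (true_beta, nw_fit, criterion) and the simulated data-generating process are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated series whose conditional mean depends on one linear
# combination ("direction") of the past two observations (assumed DGP).
n, p = 500, 2
true_beta = np.array([0.6, -0.8])            # hypothetical true direction
y = np.zeros(n + p)
for t in range(p, n + p):
    index = true_beta @ y[t - p:t][::-1]     # beta' (y_{t-1}, y_{t-2})
    y[t] = np.sin(index) + 0.1 * rng.standard_normal()

Y = y[p:]                                            # current observations
X = np.column_stack([y[p - 1:-1], y[p - 2:-2]])      # (y_{t-1}, y_{t-2}) rows

def nw_fit(u, Y, h=0.3):
    """Leave-one-out Nadaraya-Watson estimate of E[Y | u] at each point u."""
    d = u[:, None] - u[None, :]
    w = np.exp(-0.5 * (d / h) ** 2)
    np.fill_diagonal(w, 0.0)                 # leave-one-out to avoid self-fit
    return (w @ Y) / w.sum(axis=1)

def criterion(theta):
    """Residual sum of squares of a kernel regression of Y on beta'X,
    for the unit direction beta = (cos theta, sin theta); lower is better."""
    beta = np.array([np.cos(theta), np.sin(theta)])
    u = X @ beta
    return np.sum((Y - nw_fit(u, Y)) ** 2)

# Directions are identified only up to sign and scale, so a grid over the
# half-circle [0, pi) suffices.
thetas = np.linspace(0.0, np.pi, 360)
best = min(thetas, key=criterion)
beta_hat = np.array([np.cos(best), np.sin(best)])

# Alignment with the normalized true direction (up to sign): near 1 means
# the single mean direction was recovered.
b0 = true_beta / np.linalg.norm(true_beta)
print(abs(beta_hat @ b0))
```

The same device extends in spirit to the variance function: one would repeat the direction search with squared centered residuals in place of Y, though the paper's actual estimator optimizes a Kullback–Leibler criterion rather than this least-squares surrogate.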