Structured functional additive regression in reproducing kernel Hilbert spaces
Author(s) -
Hongxiao Zhu,
Fang Yao,
Hao Helen Zhang
Publication year - 2014
Publication title -
Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/rssb.12036
Subject(s) - functional principal component analysis , reproducing kernel hilbert space , hilbert space , additive model , functional data analysis , regularization (mathematics) , mathematics , principal component regression , mathematical optimization , rate of convergence , kernel principal component analysis , computer science , principal component analysis , algorithm , kernel method , econometrics , statistics , support vector machine , mathematical analysis , pure mathematics
Summary Functional additive models provide a flexible yet simple framework for regression involving functional predictors. The use of a data‐driven basis in an additive rather than linear structure naturally extends the classical functional linear model. However, the critical issue of selecting the non‐linear additive components has been less studied. In this work, we propose a new regularization framework for structure estimation in the context of reproducing kernel Hilbert spaces. The proposed approach takes advantage of functional principal components, which greatly facilitates both implementation and theoretical analysis. Selection and estimation are achieved by penalized least squares with a penalty that encourages a sparse structure among the additive components. Theoretical properties such as the rate of convergence are investigated. The empirical performance is demonstrated through simulation studies and a real‐data application.
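The workflow described in the summary can be illustrated with a minimal sketch, not the authors' implementation: functional predictors are reduced to functional principal component (FPC) scores, each score enters the model through a small basis expansion, and a group-type sparsity penalty selects whole additive components. The simple polynomial basis and the proximal-gradient group-lasso solver below stand in for the RKHS expansion and penalty used in the paper; names such as `n_components`, `degree`, and `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate functional predictors observed on a dense grid ---
n, n_grid = 200, 100
t = np.linspace(0, 1, n_grid)
scores_true = rng.normal(size=(n, 3)) * np.array([2.0, 1.0, 0.5])
fourier = np.vstack([np.sin((k + 1) * np.pi * t) for k in range(3)])
X = scores_true @ fourier + 0.1 * rng.normal(size=(n, n_grid))

# response depends non-linearly on the first two FPC scores only
y = np.sin(scores_true[:, 0]) + scores_true[:, 1] ** 2 + 0.2 * rng.normal(size=n)

# --- step 1: functional PCA via SVD of the centered discretized curves ---
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
n_components = 4
scores = Xc @ Vt[:n_components].T          # estimated FPC scores, n x n_components

# --- step 2: basis expansion of each score (one block per additive component) ---
degree = 3
blocks = []
for j in range(n_components):
    s = (scores[:, j] - scores[:, j].mean()) / scores[:, j].std()
    blocks.append(np.column_stack([s ** d for d in range(1, degree + 1)]))
B = np.hstack(blocks)                       # n x (n_components * degree)
group = np.repeat(np.arange(n_components), degree)

# --- step 3: penalized least squares with a component-wise sparsity penalty ---
def fit_group_lasso(B, y, group, lam, n_iter=2000):
    """Proximal gradient for (1/2n)||y - B beta||^2 + lam * sum_j ||beta_j||_2."""
    n_obs, p = B.shape
    beta = np.zeros(p)
    step = 1.0 / (np.linalg.norm(B, 2) ** 2 / n_obs)   # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = -B.T @ (y - B @ beta) / n_obs
        z = beta - step * grad
        for g in np.unique(group):
            idx = group == g
            norm = np.linalg.norm(z[idx])
            z[idx] = 0.0 if norm == 0 else max(0.0, 1 - step * lam / norm) * z[idx]
        beta = z
    return beta

beta = fit_group_lasso(B, y - y.mean(), group, lam=0.1)
selected = [j for j in range(n_components)
            if np.linalg.norm(beta[group == j]) > 1e-8]
print("selected additive components (FPC indices):", selected)
```

With the settings above, components whose coefficient blocks are shrunk exactly to zero are dropped from the fitted additive model, mimicking the sparse structure selection described in the summary; the paper's actual penalty, basis, and tuning are given in the article itself.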