Dimension reduction and coefficient estimation in multivariate linear regression
Author(s) - Ming Yuan, Ali Ekici, Zhaosong Lu, Renato Monteiro
Publication year - 2007
Publication title - Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/j.1467-9868.2007.00591.x
Subject(s) - coefficient matrix, multivariate statistics, mathematics, dimension (graph theory), dimensionality reduction, parametric statistics, mathematical optimization, least squares function approximation, sliced inverse regression, linear regression, computer science, statistics, regression, artificial intelligence, eigenvalues and eigenvectors, physics, quantum mechanics, estimator, pure mathematics
Summary We introduce a general formulation for dimension reduction and coefficient estimation in the multivariate linear model. We argue that many existing methods that are commonly used in practice can be formulated within this framework and are subject to various restrictions. We then propose a new method that is more flexible and more broadly applicable. The proposed method can be formulated as a novel penalized least squares estimate, in which the penalty is the Ky Fan norm of the coefficient matrix. This penalty encourages sparsity among the singular values and at the same time yields shrinkage coefficient estimates, thereby conducting dimension reduction and coefficient estimation simultaneously in the multivariate linear model. We also propose a generalized cross‐validation type of criterion for selecting the tuning parameter in the penalized least squares. Simulations and an application in financial econometrics demonstrate the competitive performance of the new method. An extension to the non‐parametric factor model is also discussed.
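To illustrate the kind of estimate the summary describes, the following is a minimal NumPy sketch of penalized least squares with a nuclear-norm (Ky Fan type) penalty on the coefficient matrix, solved by proximal gradient descent with singular value soft-thresholding. This is only an illustration of the general idea of simultaneous rank reduction and shrinkage; it is not the authors' algorithm or tuning procedure, and the function names, the penalty level `lam`, and the synthetic data are my own assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular value soft-thresholding: the prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_thr = np.maximum(s - tau, 0.0)  # shrink singular values toward zero
    return U @ np.diag(s_thr) @ Vt

def nuclear_norm_ls(X, Y, lam, n_iter=500):
    """Proximal gradient for 0.5*||Y - X B||_F^2 + lam*||B||_* (illustrative sketch)."""
    p, q = X.shape[1], Y.shape[1]
    B = np.zeros((p, q))
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y)            # gradient of the least squares loss
        B = svt(B - step * grad, step * lam)  # gradient step, then shrink singular values
    return B

# Synthetic low-rank example (hypothetical sizes and noise level)
rng = np.random.default_rng(0)
n, p, q, r = 100, 8, 6, 2
X = rng.standard_normal((n, p))
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))  # rank-2 coefficients
Y = X @ B_true + 0.1 * rng.standard_normal((n, q))

B_hat = nuclear_norm_ls(X, Y, lam=20.0)
ranks = int(np.sum(np.linalg.svd(B_hat, compute_uv=False) > 1e-6))
```

With a sufficiently large penalty, the fitted coefficient matrix has several singular values shrunk exactly to zero, so its rank drops below min(p, q): this is the sense in which one penalty performs dimension reduction and coefficient shrinkage at once. In practice the penalty level would be chosen by a criterion such as the generalized cross-validation type of rule the paper proposes, rather than fixed by hand.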