Shrinkage inverse regression estimation for model‐free variable selection
Author(s) - Bondell, Howard D.; Li, Lexin
Publication year - 2009
Publication title - Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/j.1467-9868.2008.00686.x
Subject(s) - sliced inverse regression , estimator , sufficient dimension reduction , shrinkage , dimension (graph theory) , consistency (knowledge bases) , dimensionality reduction , feature selection , lasso (statistics) , mathematics , regression analysis , regression , statistics , shrinkage estimator , variable (mathematics) , reduction (mathematics) , selection (genetic algorithm) , estimation , inverse , computer science , artificial intelligence , efficient estimator , engineering , mathematical analysis , geometry , systems engineering , minimum variance unbiased estimator , world wide web , pure mathematics
Summary. The family of inverse regression estimators recently proposed by Cook and Ni has proven effective in dimension reduction by transforming the high-dimensional predictor vector into its low-dimensional projections. We propose a general shrinkage estimation strategy for the entire inverse regression estimation family that is capable of simultaneous dimension reduction and variable selection. We demonstrate that the new estimators achieve consistency in variable selection without requiring any traditional model, while retaining the root-n estimation consistency of the dimension reduction basis. We also show the effectiveness of the new estimators through both simulation and real data analysis.
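
To illustrate the idea sketched in the summary (simultaneous dimension reduction and variable selection by shrinking an inverse regression estimate), the following Python sketch combines classical sliced inverse regression (SIR) with a lasso-type, row-wise shrinkage of the estimated basis. It is not the authors' IRE-family estimator from the paper; the slice count n_slices, the penalty lam, the soft-thresholding update, and all function names are illustrative assumptions.

# A minimal sketch of shrinkage sliced inverse regression (SIR), assuming a
# lasso-type, row-wise shrinkage of the SIR basis.  This is NOT the authors'
# IRE-family estimator; n_slices, n_dirs, lam and all names are illustrative.
import numpy as np

def shrinkage_sir(X, y, n_slices=10, n_dirs=2, lam=0.05):
    """Estimate a SIR basis, then shrink each predictor's row of the basis;
    rows shrunk exactly to zero drop the corresponding predictor."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)

    # --- Classical SIR on standardized predictors ---
    Sigma = np.cov(X, rowvar=False)
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    Z = Xc @ Sigma_inv_sqrt

    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted slice means of Z estimate the (standardized) inverse regression curve
    M = np.vstack([np.sqrt(len(idx) / n) * Z[idx].mean(axis=0) for idx in slices])
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    B = Sigma_inv_sqrt @ Vt[:n_dirs].T             # p x d basis on the original scale

    # --- Shrinkage step: row-wise scale factors via a separable lasso ---
    # Slice means of the centered predictors (p x H) and their fit within span(B)
    Xi = np.vstack([np.sqrt(len(idx) / n) * Xc[idx].mean(axis=0) for idx in slices]).T
    Gamma = np.linalg.lstsq(B, Xi, rcond=None)[0]  # d x H coordinates
    F = B @ Gamma                                  # fitted inverse regression, p x H
    # For each predictor j: min over a_j of sum_h (Xi[j,h] - a_j*F[j,h])^2 + lam*|a_j|
    num = np.sum(Xi * F, axis=1)
    den = np.maximum(np.sum(F * F, axis=1), 1e-12)
    alpha = np.sign(num) * np.maximum(np.abs(num) - lam / 2.0, 0.0) / den
    B_shrunk = alpha[:, None] * B
    selected = np.flatnonzero(np.abs(alpha) > 1e-10)
    return B_shrunk, selected

# Toy usage: y depends on X only through the first two predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 10))
y = X[:, 0] / (0.5 + (X[:, 1] + 1.5) ** 2) + 0.1 * rng.standard_normal(400)
B_hat, active = shrinkage_sir(X, y)
print("selected predictors:", active)   # ideally a small set containing 0 and 1

Predictors whose shrinkage factor is driven exactly to zero are removed, which is how an L1-type shrinkage of the basis yields variable selection without specifying a forward regression model.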
