A note on sliced inverse regression with missing predictors
Author(s) - Yuexiao Dong, Liping Zhu
Publication year - 2012
Publication title - Statistical Analysis and Data Mining: The ASA Data Science Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.381
H-Index - 33
eISSN - 1932-1872
pISSN - 1932-1864
DOI - 10.1002/sam.10132
Subject(s) - sliced inverse regression , collinearity , dimensionality reduction , missing data , sufficient dimension reduction , curse of dimensionality , covariance matrix , regression analysis , statistics , mathematics
Sufficient dimension reduction (SDR) is effective in high‐dimensional data analysis because it mitigates the curse of dimensionality while retaining full regression information. Missing predictors are common in high‐dimensional data, yet have only occasionally been discussed in the SDR context. In this paper, an inverse probability weighted sliced inverse regression (SIR) is studied with predictors missing at random. We cast SIR into the estimating equation framework to avoid inverting a large‐scale covariance matrix. This strategy handles large dimensionality and strong collinearity among the predictors more efficiently than the spectral decomposition of classical SIR. Numerical studies confirm the superiority of our proposed procedure over existing methods. © 2011 Wiley Periodicals, Inc. Statistical Analysis and Data Mining, 2011
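For context, the classical SIR procedure that the abstract contrasts against can be sketched as follows. This is a minimal illustration of the standard spectral-decomposition approach (not the paper's estimating-equation method); the function name, slice count, and toy model are assumptions chosen for the example. Note the explicit inversion of the predictor covariance matrix, which is the step that becomes unstable under strong collinearity and large dimensionality:

```python
import numpy as np

def sir_directions(X, y, n_slices=8, n_dirs=1):
    """Classical sliced inverse regression via spectral decomposition.

    Standardizes the predictors, averages them within slices of the
    response, and eigen-decomposes the covariance of those slice
    means. Requires inverting (the square root of) the predictor
    covariance matrix -- the costly, collinearity-sensitive step the
    estimating-equation formulation is designed to avoid.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    # Inverse square root of the covariance matrix (fails or is
    # unstable when Sigma is near-singular, i.e. collinear predictors)
    vals, vecs = np.linalg.eigh(Sigma)
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = Xc @ inv_sqrt
    # Slice the sample by the ordered response and average Z per slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    _, evecs = np.linalg.eigh(M)
    beta = inv_sqrt @ evecs[:, -n_dirs:]
    return beta / np.linalg.norm(beta, axis=0)

# Toy single-index model: y depends on X only through beta_true
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
beta_true = np.array([1.0, 1.0, 0, 0, 0, 0]) / np.sqrt(2.0)
index = X @ beta_true
y = index + 0.1 * index**3 + 0.1 * rng.normal(size=500)

beta_hat = sir_directions(X, y)
```

On complete data with a monotone link, the estimated direction `beta_hat` aligns closely with `beta_true` (up to sign). The paper's contribution is to keep this estimation well defined when predictors are missing at random, by reweighting complete cases with inverse probability weights inside an estimating-equation system rather than repeating the spectral decomposition above.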