
Robust variable selection in sliced inverse regression using Tukey’s biweight criterion and ball covariance
Author(s) -
Ali Alkenani
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1664/1/012034
Subject(s) - mathematics , statistics , outlier , covariance , mean squared error , feature selection , covariate , inverse regression , Bayesian information criterion , robustness , estimation of covariance matrices , regression , computer science , artificial intelligence
Shrinkage sliced inverse regression (SSIR) is a variable selection method developed under sufficient dimension reduction (SDR) theory. SSIR merges Lasso shrinkage with sliced inverse regression (SIR) to obtain sparse and accurate solutions. However, its dependence on the squared loss function and on classical estimates of location and dispersion makes it very sensitive to outliers. In this paper, a robust variable selection method based on SSIR, called RSSIR, is proposed. The squared loss is replaced by Tukey’s biweight criterion, and the classical estimates of the mean and covariance matrix are replaced with the median and ball covariance, which are robust measures of location and dispersion, respectively. The proposed RSSIR is resistant to outliers in both the response and the covariates. In addition, a robust version of the residual information criterion (RIC) is proposed for selecting the regularisation parameter. Simulations and real data analyses show that RSSIR produces very reliable results; in the presence of outliers, its performance is significantly better than that of SSIR and other existing methods.
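The biweight criterion referred to in the abstract has a standard closed form: a residual is penalised quadratically near zero but the loss saturates at a constant for large residuals, which is what bounds the influence of outliers. A minimal sketch in Python follows; the tuning constant c = 4.685 is the conventional default for this loss, not a value taken from the paper, and the function is illustrative rather than the authors' implementation.

```python
import numpy as np

def tukey_biweight_loss(r, c=4.685):
    """Tukey's biweight (bisquare) loss.

    rho(r) = (c^2 / 6) * (1 - (1 - (r/c)^2)^3)  for |r| <= c
           = c^2 / 6                             for |r| >  c

    Small residuals are penalised roughly quadratically; residuals
    beyond c all receive the same bounded loss, so outliers cannot
    dominate the fit as they do under squared loss.
    """
    r = np.asarray(r, dtype=float)
    bound = c**2 / 6.0
    out = np.full_like(r, bound)          # saturated value for |r| > c
    inside = np.abs(r) <= c
    u = r[inside] / c
    out[inside] = bound * (1.0 - (1.0 - u**2) ** 3)
    return out
```

For example, `tukey_biweight_loss(0.0)` is exactly 0, while any residual larger than c in magnitude contributes the same fixed loss c²/6 ≈ 3.66, in contrast to squared loss, which grows without bound.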