Response and predictor folding to counter symmetric dependency in dimension reduction
Author(s) -
Prendergast L.A.,
Garnham A.L.
Publication year - 2016
Publication title -
Australian and New Zealand Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 41
eISSN - 1467-842X
pISSN - 1369-1473
DOI - 10.1111/anzs.12170
Subject(s) - sliced inverse regression , sufficient dimension reduction , dimensionality reduction , ordinary least squares , least squares function approximation , regression analysis , estimator , robust statistics , mathematics , statistics
Summary - In the regression setting, dimension reduction allows for complicated regression structures to be detected via visualisation in a low‐dimensional framework. However, some popular dimension reduction methodologies fail to achieve this aim when faced with a problem often referred to as symmetric dependency. In this paper we show how vastly superior results can be achieved when carrying out response and predictor transformations for methods such as least squares and sliced inverse regression. These transformations are simple to implement and utilise estimates from other dimension reduction methods that are not faced with the symmetric dependency problem. We highlight the effectiveness of our approach via simulation and an example. Furthermore, we show that ordinary least squares can effectively detect multiple dimension reduction directions. Methods robust to extreme response values are also considered.
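The symmetric dependency problem the abstract refers to can be illustrated with a small simulation. When the response depends symmetrically on a linear combination of the predictors (e.g. y = (β'x)² + ε with centred predictors), the covariance between each predictor and the response vanishes, so the ordinary least squares slope vector carries no directional information. The sketch below is not the authors' exact transformation: it uses a simplified coordinate-wise predictor folding (reflecting each centred predictor about zero) in a toy model where the true direction is axis-aligned, whereas the paper folds using direction estimates from methods unaffected by symmetric dependency.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 5
beta = np.zeros(p)
beta[0] = 1.0  # true dimension-reduction direction (axis-aligned for this toy)

X = rng.standard_normal((n, p))
# Response depends symmetrically on beta'x, so cov(X, y) is (near) zero
y = (X @ beta) ** 2 + 0.1 * rng.standard_normal(n)

def ols_slope(X, y):
    """Centred OLS slope vector, used as a one-direction estimate."""
    Xc = X - X.mean(axis=0)
    return np.linalg.lstsq(Xc, y - y.mean(), rcond=None)[0]

# Plain OLS: slope vector is essentially noise under symmetric dependency
b_raw = ols_slope(X, y)

# Toy predictor folding: reflect each centred predictor about zero so the
# folded predictor |x1| is positively correlated with y = x1^2 + noise
X_fold = np.abs(X - X.mean(axis=0))
b_fold = ols_slope(X_fold, y)

alignment = abs(b_fold @ beta) / np.linalg.norm(b_fold)
print("raw slope norm:", round(float(np.linalg.norm(b_raw)), 3))
print("folded alignment with beta:", round(float(alignment), 3))
```

In this toy model the raw OLS slope is close to the zero vector, while OLS on the folded predictors aligns closely with the true direction, which is the qualitative behaviour the abstract's transformations are designed to restore.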