A review of some recent developments in robust regression
Author(s) - Rand R. Wilcox
Publication year - 1996
Publication title - British Journal of Mathematical and Statistical Psychology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.157
H-Index - 51
eISSN - 2044-8317
pISSN - 0007-1102
DOI - 10.1111/j.2044-8317.1996.tb01088.x
Subject(s) - homoscedasticity , robust regression , heteroscedasticity , ordinary least squares , outlier , term (time) , mathematics , computer science , econometrics , least squares function approximation , statistics , physics , quantum mechanics , estimator
In situations where the goal is to understand how a random variable y is related to a set of p predictor variables, modern robust regression methods can be invaluable. One reason is that even one unusual value in the design space, or one outlier among the y values, can have a large impact on the ordinary least squares estimate of the parameters of the usual linear model. That is, a single unusual value or outlier can give a highly distorted view of how two or more random variables are related. Another reason is that modern robust methods can be much more efficient than ordinary least squares yet maintain good efficiency under the ideal conditions of normality and a homoscedastic error term. Even when sampling is from light‐tailed distributions, there are situations where certain robust methods are highly efficient compared to least squares, as is indicated in this paper. Most applied researchers in psychology simply ignore these problems. In the hope of improving current practice, this paper reviews some of the robust methods currently available with an emphasis on recent developments. Of particular interest are methods for computing confidence intervals and dealing with heteroscedasticity in the error term.
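The abstract's central claim, that a single outlier among the y values can badly distort the least squares fit while a robust estimator resists it, can be illustrated with a small sketch. This example is not from the paper itself: it compares the ordinary least squares slope with the Theil-Sen slope (the median of all pairwise slopes, one of the classic robust regression estimators) on data with a known slope of 2 and one contaminated y value. The data and function names are illustrative assumptions.

```python
# Sketch (not from Wilcox, 1996): one outlier in y distorts the OLS slope,
# while the robust Theil-Sen slope (median of pairwise slopes) is unaffected.
from statistics import median


def ols_slope(x, y):
    """Ordinary least squares slope via the closed-form covariance ratio."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den


def theil_sen_slope(x, y):
    """Theil-Sen estimator: median of slopes over all pairs of points."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x))
              for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return median(slopes)


x = list(range(10))
y = [2 * xi for xi in x]   # true slope is 2
y[-1] = 100                # a single outlier among the y values

print(round(ols_slope(x, y), 2))        # pulled well away from 2
print(round(theil_sen_slope(x, y), 2))  # still 2
```

Here the OLS slope is dragged to roughly 6.5 by one bad point, while the Theil-Sen estimate remains exactly 2, because fewer than half of the pairwise slopes involve the outlier. The paper surveys a much broader range of robust estimators and, in addition, methods for confidence intervals under heteroscedasticity, which this sketch does not address.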
