
Comparison Some Robust Regularization Methods in Linear Regression via Simulation Study
Author(s) - Sherzad Muhammed Ajeel, Hussein Hashem
Publication year - 2020
Publication title - Academic Journal of Nawroz University
Language(s) - English
Resource type - Journals
ISSN - 2520-789X
DOI - 10.25007/ajnu.v9n2a818
Subject(s) - lasso , outlier , quantile regression , robust regression , ordinary least squares , mathematics , linear regression , feature selection , coordinate descent , regression , statistics , econometrics , mathematical optimization
In this paper, we reviewed some variable selection methods for the linear regression model. Conventional methodologies such as Ordinary Least Squares (OLS) are among the most commonly used for estimating the parameters of a linear regression. However, OLS estimates perform poorly when the dataset contains outliers or when the assumption of normality is violated, as in the case of heavy-tailed errors. To address this problem, robust regularized regression methods such as the Huber Lasso (Rosset and Zhu, 2007) and quantile regression (Koenker and Bassett, 1978) were proposed. This paper compares the performance of seven methods: the quantile regression estimates, the Huber Lasso estimates, the adaptive Huber Lasso estimates, the adaptive LAD Lasso estimates, the Gamma-divergence estimates, the Maximum Tangent Likelihood Lasso (MTE) estimates, and the Semismooth Newton Coordinate Descent (SNCD) Huber loss estimates.
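The failure mode described above, OLS degrading under heavy-tailed errors while a robust loss remains stable, can be illustrated with a small simulation. This is a minimal sketch, not the paper's simulation design: it generates a linear model with t-distributed (heavy-tailed) errors and compares plain OLS against a Huber M-estimate computed by iteratively reweighted least squares (IRLS); the sample sizes, error distribution, and tuning constant are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate y = X @ beta + e with heavy-tailed t(2) errors -- the setting
# in which OLS is expected to perform poorly (assumed design, not the paper's).
n, p = 200, 3
beta_true = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(n, p))
e = rng.standard_t(df=2, size=n)  # heavy-tailed errors
y = X @ beta_true + e

# OLS estimate via least squares
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

def huber_irls(X, y, delta=1.345, n_iter=50):
    """Huber M-estimate via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start from OLS
    for _ in range(n_iter):
        r = y - X @ beta
        # Robust residual scale via the median absolute deviation (MAD)
        scale = np.median(np.abs(r)) / 0.6745 + 1e-12
        u = r / scale
        # Huber weights: 1 inside the cutoff, downweighted outside
        w = np.where(np.abs(u) <= delta, 1.0, delta / np.abs(u))
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)
    return beta

beta_huber = huber_irls(X, y)

err_ols = np.linalg.norm(beta_ols - beta_true)
err_huber = np.linalg.norm(beta_huber - beta_true)
print(f"OLS error: {err_ols:.3f}, Huber error: {err_huber:.3f}")
```

Averaged over many replications, the Huber estimate typically has smaller coefficient error than OLS under t(2) errors; adding an L1 penalty to the Huber loss yields the Huber Lasso variants compared in the paper.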