ADMM for Penalized Quantile Regression in Big Data
Author(s) -
Yu Liqun,
Lin Nan
Publication year - 2017
Publication title -
International Statistical Review
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.051
H-Index - 54
eISSN - 1751-5823
pISSN - 0306-7734
DOI - 10.1111/insr.12221
Subject(s) - quantile regression , quantile , big data , computer science , linear programming , simplex algorithm , regression , mathematical optimization , algorithm , data mining , mathematics , machine learning , econometrics , statistics
Summary - Traditional linear programming algorithms for quantile regression, such as the simplex method and the interior point method, work well for data of small to moderate size. However, these methods are difficult to generalize to high‐dimensional big data, for which penalization is usually necessary. Further, the massive size of contemporary big data calls for large‐scale algorithms on distributed computing platforms. Traditional linear programming algorithms are intrinsically sequential and not suitable for such frameworks. In this paper, we discuss how to use the popular ADMM algorithm to solve large‐scale penalized quantile regression problems. The ADMM algorithm can be easily parallelized and implemented in modern distributed frameworks. Simulation results demonstrate that ADMM is as accurate as traditional LP algorithms while being faster even in the nonparallel case.
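To make the idea concrete, the following is a minimal sketch (not the paper's exact algorithm) of one common ADMM formulation for lasso-penalized quantile regression: the problem min_beta sum_i rho_tau(y_i - x_i'beta) + lam*||beta||_1 is split via the constraint X beta + r = y, the residual block r has a closed-form proximal step for the check loss, and the beta step is linearized so it reduces to a single soft-thresholding. All names, the step-size choice eta, and default parameters are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # elementwise soft-thresholding, the prox of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_qr_lasso(X, y, tau=0.5, lam=1.0, rho=1.0, n_iter=2000):
    """Linearized (proximal) ADMM sketch for penalized quantile regression:
    min_beta sum_i rho_tau(r_i) + lam*||beta||_1  s.t.  X beta + r = y."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                          # residual block, r = y - X beta
    u = np.zeros(n)                       # scaled dual variable
    eta = rho * np.linalg.norm(X, 2) ** 2 # majorization constant for the beta step
    for _ in range(n_iter):
        # beta step: linearize the quadratic term, then soft-threshold
        grad = rho * (X.T @ (X @ beta + r - y + u))
        beta = soft_threshold(beta - grad / eta, lam / eta)
        # r step: closed-form prox of the check loss rho_tau
        v = y - X @ beta - u
        r = v - np.clip(v, (tau - 1.0) / rho, tau / rho)
        # dual ascent on the constraint X beta + r = y
        u = u + X @ beta + r - y
    return beta
```

The r step is fully separable across observations, which is what makes the algorithm easy to parallelize across data blocks in a distributed setting; the beta step needs only matrix-vector products with X and X'.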
