Multiple‐output quantile regression through optimal quantization
Author(s) - Isabelle Charlier, Davy Paindaveine, Jérôme Saracco
Publication year - 2020
Publication title - Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/sjos.12426
Subject(s) - estimator , quantile , mathematics , kernel regression , kernel (algebra) , quantization (signal processing) , nonparametric regression , nonparametric statistics , quantile regression , statistics , econometrics , mathematical optimization , combinatorics
A new nonparametric quantile regression method based on the concept of optimal quantization was developed recently and was shown to provide estimators that often dominate their classical, kernel‐type competitors. In the present work, we extend this method to multiple‐output regression problems. We show how quantization allows one to approximate population multiple‐output regression quantiles based on halfspace depth. We prove that this approximation becomes arbitrarily accurate as the size of the quantization grid goes to infinity. We also derive a weak consistency result for a sample version of the proposed regression quantiles. Through simulations, we compare the performance of our estimators with that of their (local constant and local bilinear) kernel competitors. The results reveal that the proposed quantization‐based estimators, which are local constant in nature, outperform their kernel counterparts and often even dominate their local bilinear kernel competitors. The various approaches are also compared on artificial and real data.
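
The abstract describes local constant estimators built on an optimal quantization grid of the covariate. The Python sketch below illustrates the general idea in the simpler single‐output setting only (the multiple‐output regression quantiles based on halfspace depth studied in the paper are not implemented here): the covariate is quantized with Lloyd's algorithm, and the conditional quantile at a point x is estimated by the empirical quantile of the responses whose covariates fall in the same quantization cell as x. The grid size, the number of Lloyd iterations, and the synthetic data‐generating model are illustrative assumptions, not taken from the paper.

import numpy as np

def lloyd_quantize(X, N, n_iter=50, seed=None):
    """Approximate an optimal (L2) quantization grid of X with N points via Lloyd's algorithm."""
    rng = np.random.default_rng(seed)
    grid = X[rng.choice(len(X), size=N, replace=False)]
    for _ in range(n_iter):
        # assign each observation to its nearest grid point
        dist = np.linalg.norm(X[:, None, :] - grid[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # move each grid point to the centroid of its cell (leave empty cells unchanged)
        for j in range(N):
            if np.any(labels == j):
                grid[j] = X[labels == j].mean(axis=0)
    return grid

def quantized_conditional_quantile(x, alpha, X, Y, grid):
    """Local constant estimator: empirical alpha-quantile of Y over the quantization cell containing x."""
    dist = np.linalg.norm(X[:, None, :] - grid[None, :, :], axis=2)
    labels = dist.argmin(axis=1)
    cell = np.linalg.norm(grid - x, axis=1).argmin()
    return np.quantile(Y[labels == cell], alpha)

# Illustrative usage on synthetic heteroskedastic data (assumed model, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 1))
Y = np.sin(np.pi * X[:, 0]) + (0.5 + 0.5 * X[:, 0] ** 2) * rng.normal(size=2000)
grid = lloyd_quantize(X, N=20, seed=0)
x0 = np.array([0.3])
print(quantized_conditional_quantile(x0, 0.5, X, Y, grid))  # estimated conditional median at x0
print(quantized_conditional_quantile(x0, 0.9, X, Y, grid))  # estimated conditional 0.9-quantile at x0

In this sketch the grid size N plays a role analogous to the bandwidth of a kernel estimator: a larger grid yields more local, but noisier, estimates.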
