Maximum regularized likelihood estimators: A general prediction theory and applications
Author(s) - Zhuang Rui, Lederer Johannes
Publication year - 2018
Publication title - Stat
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.61
H-Index - 18
ISSN - 2049-1573
DOI - 10.1002/sta4.186
Subject(s) - estimator , mathematics , Kullback–Leibler divergence , regularization , statistics , computer science
Maximum regularized likelihood estimators (MRLEs) are arguably the most established class of estimators in high‐dimensional statistics. In this paper, we derive guarantees for MRLEs in the Kullback–Leibler divergence, a general measure of prediction accuracy. We assume only that the densities have a convex parametrization and that the regularizer is definite and positively homogeneous. The results thus apply to a very large variety of models and estimators, such as tensor regression and graphical models with convex and non‐convex regularized methods. A main conclusion is that MRLEs are broadly consistent in prediction—regardless of whether restricted eigenvalues or similar conditions hold. Copyright © 2018 John Wiley & Sons, Ltd.
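As a concrete illustration of the class the abstract describes (not the paper's own method), the sketch below fits an MRLE for a Gaussian linear model: the negative log-likelihood is penalized by an ℓ1 norm, which is a definite, positively homogeneous regularizer, and prediction accuracy is then measured by the Kullback–Leibler divergence, which for unit-variance Gaussians reduces to half the mean squared prediction error. The data, tuning parameter, and solver (plain proximal gradient / ISTA) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sparse Gaussian linear model (assumed, not from the paper)
n, p = 100, 20
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
X = rng.standard_normal((n, p))
y = X @ beta_true + rng.standard_normal(n)

def mrle_lasso(X, y, lam, n_iter=500):
    """MRLE with l1 penalty: minimize 0.5*||y - X b||^2 + lam*||b||_1
    via proximal gradient descent (ISTA)."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L for the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)          # gradient of the NLL term
        z = beta - step * grad               # gradient step
        # proximal step = soft-thresholding (prox of the l1 penalty)
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta

beta_hat = mrle_lasso(X, y, lam=0.5)

# For unit-variance Gaussian conditionals, the KL divergence between the
# true and fitted predictive densities is half the mean squared
# prediction error on the design:
kl = 0.5 * np.mean((X @ (beta_true - beta_hat)) ** 2)
print("KL prediction error:", kl)
```

The point of the example is the structure of the objective: any smooth convex negative log-likelihood plus a definite, positively homogeneous penalty (ℓ1, group norms, nuclear norm, ...) yields an MRLE of the kind the abstract's consistency result covers.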
