High‐dimensional covariance matrix estimation using a low‐rank and diagonal decomposition
Author(s) - Yilei Wu, Yingli Qin, Mu Zhu
Publication year - 2020
Publication title - Canadian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.804
H-Index - 51
eISSN - 1708-945X
pISSN - 0319-5724
DOI - 10.1002/cjs.11532
Subject(s) - mathematics , covariance matrix , estimator , estimation of covariance matrices , covariance , rank (graph theory) , covariance function , consistency (knowledge bases) , mathematical optimization , statistics , algorithm , combinatorics , geometry
We study high‐dimensional covariance/precision matrix estimation under the assumption that the covariance/precision matrix can be decomposed into a low‐rank component L and a diagonal component D . The rank of L can either be chosen to be small or controlled by a penalty function. Under moderate conditions on the population covariance/precision matrix itself and on the penalty function, we prove some consistency results for our estimators. A block‐wise coordinate descent algorithm, which iteratively updates L and D , is then proposed to obtain the estimator in practice. Finally, various numerical experiments are presented; using simulated data, we show that our estimator performs quite well in terms of the Kullback–Leibler loss; using stock return data, we show that our method can be applied to obtain enhanced solutions to the Markowitz portfolio selection problem. The Canadian Journal of Statistics 48: 308–337; 2020 © 2019 Statistical Society of Canada
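The abstract describes an estimator that splits the covariance matrix into a low-rank component L and a diagonal component D, fitted by a block-wise coordinate descent that alternately updates L and D. The sketch below illustrates that alternating idea under a simple least-squares surrogate (best rank-k PSD approximation for the L-step, diagonal matching for the D-step); it is not the paper's penalized likelihood objective, and the function name and defaults are illustrative assumptions.

```python
import numpy as np

def fit_lowrank_plus_diag(S, rank, n_iter=50, eps=1e-8):
    """Alternately update a rank-`rank` PSD component L and a positive diagonal D
    so that L + D approximates the sample covariance S (Frobenius-norm surrogate,
    not the authors' exact penalized objective)."""
    D = np.diag(np.diag(S)).copy()               # initialize D with the sample variances
    L = np.zeros_like(S)
    for _ in range(n_iter):
        # L-step: best rank-k nonnegative-eigenvalue approximation of S - D
        vals, vecs = np.linalg.eigh(S - D)
        idx = np.argsort(vals)[::-1][:rank]
        top = np.clip(vals[idx], 0.0, None)
        L = (vecs[:, idx] * top) @ vecs[:, idx].T
        # D-step: match the diagonal of S - L, keeping entries strictly positive
        D = np.diag(np.clip(np.diag(S - L), eps, None))
    return L, D

# Usage on data simulated from a true low-rank-plus-diagonal covariance
rng = np.random.default_rng(0)
p, n, k = 50, 200, 3
B = rng.normal(size=(p, k))
Sigma = B @ B.T + np.eye(p)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = np.cov(X, rowvar=False)
L_hat, D_hat = fit_lowrank_plus_diag(S, rank=k)
Sigma_hat = L_hat + D_hat                        # estimated covariance matrix
```

In the paper the rank of L is either fixed at a small value or controlled through a penalty; the fixed-rank version corresponds to the hard rank constraint used in this sketch.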