An imputation–regularized optimization algorithm for high dimensional missing data problems and beyond
Author(s) - Liang Faming, Jia Bochao, Xue Jingnan, Li Qizhai, Luo Ye
Publication year - 2018
Publication title - Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/rssb.12279
Subject(s) - missing data, imputation (statistics), algorithm, expectation–maximization algorithm, regularization (linguistics), computer science, optimization problem, Gaussian, mathematics, mathematical optimization, maximum likelihood, statistics, artificial intelligence, machine learning, physics, quantum mechanics
Missing data are frequently encountered in high dimensional problems, but they are usually difficult to handle with standard algorithms such as the expectation–maximization algorithm and its variants. Some problem-specific algorithms have been developed to tackle this difficulty, but a general algorithm is still lacking. This work fills that gap: we propose a general algorithm for high dimensional missing data problems. The algorithm iterates between an imputation step and a regularized optimization step. At the imputation step, the missing data are imputed conditionally on the observed data and the current parameter estimates; at the regularized optimization step, a consistent estimate of the minimizer of a Kullback–Leibler divergence defined on the pseudo-complete data is found via regularization. For high dimensional problems, this consistent estimate can be found under sparsity constraints. Consistency of the averaged estimate for the true parameter can be established under quite general conditions. The algorithm is illustrated with high dimensional Gaussian graphical models, high dimensional variable selection and a random-coefficient model.
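
As a rough illustration of the imputation-regularization iteration described in the abstract, the following Python sketch applies the idea to a toy sparse linear regression with covariate entries missing at random. The Gaussian working model for the covariates, the fixed noise variance sigma2, the use of scikit-learn's Lasso as the regularized optimization step, and the burn-in averaging are all illustrative assumptions made for this sketch; they are not taken from the paper itself.

# A minimal sketch of an imputation-regularized optimization loop for sparse
# linear regression with covariate entries missing at random. The Gaussian
# working model, the known noise variance and the Lasso penalty are
# assumptions for illustration, not the authors' exact specification.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated high dimensional data: sparse signal, covariates missing at random.
n, p = 100, 200
beta_true = np.zeros(p)
beta_true[:5] = 2.0
X = rng.normal(size=(n, p))
y = X @ beta_true + rng.normal(size=n)
miss = rng.random(size=(n, p)) < 0.05          # 5% of entries missing
X_obs = np.where(miss, np.nan, X)
miss_idx = list(zip(*np.where(miss)))

sigma2 = 1.0                                   # noise variance, assumed known here
beta = np.zeros(p)
X_imp = np.where(miss, 0.0, X_obs)             # initial fill-in for missing entries

n_iter, burn_in = 30, 15
beta_sum = np.zeros(p)
for t in range(n_iter):
    # Imputation step: draw each missing x_ij from its conditional distribution
    # given y_i, the remaining (observed or imputed) covariates, and the current
    # estimate of beta. Under the working assumptions x_ij ~ N(0, 1) and
    # y_i | x_i ~ N(x_i' beta, sigma2), this conditional is Gaussian with the
    # mean and variance computed below.
    for i, j in miss_idx:
        r = y[i] - X_imp[i] @ beta + beta[j] * X_imp[i, j]   # residual excluding x_ij
        denom = beta[j] ** 2 + sigma2
        mean, var = beta[j] * r / denom, sigma2 / denom
        X_imp[i, j] = mean + np.sqrt(var) * rng.normal()

    # Regularized optimization step: a sparsity-penalized (Lasso) estimate
    # computed on the pseudo-complete data.
    beta = Lasso(alpha=0.1, max_iter=10_000).fit(X_imp, y).coef_
    if t >= burn_in:
        beta_sum += beta

# Average the estimates over the post-burn-in iterations, in the spirit of the
# averaged estimate whose consistency the paper establishes.
beta_avg = beta_sum / (n_iter - burn_in)
print("selected coefficients:", np.flatnonzero(np.abs(beta_avg) > 0.5))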
