Open Access
ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
Author(s) - Anru R. Zhang, Yuetian Luo, Garvesh Raskutti, Ming Yuan
Publication year - 2020
Publication title - SIAM Journal on Mathematics of Data Science
Language(s) - English
Resource type - Journals
ISSN - 2577-0187
DOI - 10.1137/19m126476x
Subject(s) - minimax , rank (graph theory) , tensor (intrinsic definition) , dimension (graph theory) , mean squared error , gaussian , algorithm , regression , mathematics , computer science , mathematical optimization , statistics , combinatorics , geometry , physics , quantum mechanics
In this paper, we develop a novel procedure for low-rank tensor regression, namely \underline{I}mportance \underline{S}ketching \underline{L}ow-rank \underline{E}stimation for \underline{T}ensors (ISLET). The central idea behind ISLET is \emph{importance sketching}, i.e., carefully designed sketches based on both the responses and the low-dimensional structure of the parameter of interest. We show that the proposed method is sharply minimax optimal in terms of the mean-squared error under low-rank Tucker assumptions and under randomized Gaussian ensemble design. In addition, if a tensor is low-rank with group sparsity, our procedure also achieves minimax optimality. Further, we show through numerical studies that ISLET achieves mean-squared error performance comparable to or better than existing state-of-the-art methods while having substantial storage and run-time advantages, including capabilities for parallel and distributed computing. In particular, our procedure performs reliable estimation with tensors of dimension $p = O(10^8)$ and is $1$ or $2$ orders of magnitude faster than baseline methods.
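The importance-sketching idea described in the abstract can be illustrated with a simplified NumPy sketch. This is not the authors' implementation: it keeps only the core-sketch part of the estimator (the full ISLET procedure also involves additional sketch components and handles sparsity), assumes a Gaussian ensemble design $y_i = \langle \mathcal{X}_i, \mathcal{A} \rangle + \varepsilon_i$ with Tucker-low-rank $\mathcal{A}$, and all function names are illustrative.

```python
import numpy as np

def hosvd_factors(T, ranks):
    """Leading left singular vectors of each mode-k unfolding (HOSVD)."""
    factors = []
    for k in range(T.ndim):
        unf = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)
        U, _, _ = np.linalg.svd(unf, full_matrices=False)
        factors.append(U[:, :ranks[k]])
    return factors

def islet_sketch_regression(X, y, ranks):
    """Simplified ISLET-style estimator (core sketch only).

    X: (n, p1, p2, p3) Gaussian design tensors; y: (n,) responses;
    ranks: assumed Tucker ranks (r1, r2, r3).
    """
    n = X.shape[0]
    # Step 1: crude moment estimate of the parameter tensor,
    # unbiased under the Gaussian ensemble design.
    A_tilde = np.einsum('i,ijkl->jkl', y, X) / n
    # Step 2: importance-sketching directions from HOSVD of the pilot.
    U1, U2, U3 = hosvd_factors(A_tilde, ranks)
    # Step 3: sketch each covariate tensor down to r1*r2*r3 dimensions.
    Xs = np.einsum('ijkl,ja,kb,lc->iabc', X, U1, U2, U3).reshape(n, -1)
    # Step 4: low-dimensional least squares for the Tucker core.
    core, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    B = core.reshape(ranks)
    # Step 5: lift the core back to the full tensor.
    return np.einsum('abc,ja,kb,lc->jkl', B, U1, U2, U3)
```

The key point the sketch conveys is the dimension reduction: the final regression is solved in $r_1 r_2 r_3$ dimensions rather than $p_1 p_2 p_3$, which is the source of the storage and run-time advantages claimed above.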
