Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation from Incomplete Measurements

by Tian Tong et al.

Tensors, which provide a powerful and flexible model for representing multi-attribute data and multi-way interactions, play an indispensable role in modern data science across various fields in science and engineering. A fundamental task is to faithfully recover the tensor from highly incomplete measurements in a statistically and computationally efficient manner. Harnessing the low-rank structure of tensors in the Tucker decomposition, this paper develops a scaled gradient descent (ScaledGD) algorithm to directly recover the tensor factors with tailored spectral initializations, and shows that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems – tensor completion and tensor regression – as soon as the sample size is above the order of n^(3/2) (ignoring other dependencies), where n is the dimension of the tensor. This leads to an extremely scalable approach to low-rank tensor estimation compared with prior art, which suffers from at least one of the following drawbacks: extreme sensitivity to ill-conditioning, high per-iteration costs in terms of memory and computation, or poor sample complexity guarantees. To the best of our knowledge, ScaledGD is the first algorithm that achieves near-optimal statistical and computational complexities simultaneously for low-rank tensor completion with the Tucker decomposition. Our algorithm highlights the power of appropriate preconditioning in accelerating nonconvex statistical estimation, where the iteration-varying preconditioners promote desirable invariance properties of the trajectory with respect to the underlying symmetry in low-rank tensor factorization.




