Tensor Completion Made Practical

06/04/2020
by Allen Liu, et al.

Tensor completion is a natural higher-order generalization of matrix completion where the goal is to recover a low-rank tensor from sparse observations of its entries. Existing algorithms are either heuristic without provable guarantees, based on solving large semidefinite programs which are impractical to run, or make strong assumptions such as requiring the factors to be nearly orthogonal. In this paper we introduce a new variant of alternating minimization, which in turn is inspired by understanding how the progress measures that guide convergence of alternating minimization in the matrix setting need to be adapted to the tensor setting. We show strong provable guarantees, including showing that our algorithm converges linearly to the true tensors even when the factors are highly correlated and can be implemented in nearly linear time. Moreover, our algorithm is also highly practical and we show that we can complete third-order tensors with a thousand dimensions from observing a tiny fraction of their entries. In contrast, and somewhat surprisingly, we show that the standard version of alternating minimization, without our new twist, can converge at a drastically slower rate in practice.
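For context, the sketch below shows the standard alternating-least-squares baseline for third-order tensor completion that the abstract contrasts against, not the paper's new variant. It fits a rank-r CP model T ≈ Σ_k a_k ⊗ b_k ⊗ c_k to a sparse set of observed entries by cycling through the three factor matrices and solving a small least-squares problem per row. All names here (als_complete, obs, rank, n_iters) are illustrative assumptions.

```python
# A minimal sketch of standard ALS-based tensor completion (baseline, not the
# paper's algorithm). Fits a rank-r CP model to sparsely observed entries.
import numpy as np

def als_complete(shape, obs, rank, n_iters=50, seed=0):
    """obs: dict mapping observed index triples (i, j, k) -> observed value."""
    rng = np.random.default_rng(seed)
    n1, n2, n3 = shape
    A = rng.standard_normal((n1, rank))
    B = rng.standard_normal((n2, rank))
    C = rng.standard_normal((n3, rank))

    idx = np.array(list(obs.keys()))       # (m, 3) observed index triples
    vals = np.array(list(obs.values()))    # (m,) observed values
    I, J, K = idx[:, 0], idx[:, 1], idx[:, 2]

    def update(target, left, right, rows, lcols, rcols):
        # For the CP model, T[i, j, k] = target[row, :] @ (left[lcol, :] * right[rcol, :]),
        # so each row of the target factor solves its own least-squares problem
        # restricted to the entries observed in that slice.
        design = left[lcols] * right[rcols]          # Khatri-Rao rows for observed entries
        new = np.zeros_like(target)
        for i in range(target.shape[0]):
            mask = rows == i
            if mask.any():
                new[i], *_ = np.linalg.lstsq(design[mask], vals[mask], rcond=None)
            else:
                new[i] = target[i]                   # no observations in this slice
        return new

    for _ in range(n_iters):
        A = update(A, B, C, I, J, K)
        B = update(B, A, C, J, I, K)
        C = update(C, A, B, K, I, J)
    return A, B, C
```

Under random sampling of entries, each call cycles once over the three modes; the per-row least-squares problems are small (r unknowns each), which is what keeps plain ALS cheap per iteration even when, as the abstract notes, its convergence rate can be drastically slower than the paper's modified scheme.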


