Efficiency of First-Order Methods for Low-Rank Tensor Recovery with the Tensor Nuclear Norm Under Strict Complementarity

08/03/2023
by Dan Garber, et al.

We consider convex relaxations for recovering low-rank tensors based on constrained minimization over a ball induced by the tensor nuclear norm, recently introduced in <cit.>. We build on a recent line of results that considered convex relaxations for the recovery of low-rank matrices and established that, under a strict complementarity condition (SC), both the convergence rate and per-iteration runtime of standard gradient methods may improve dramatically. We develop the appropriate strict complementarity condition for the tensor nuclear norm ball and, under this condition, obtain the following main results:

1. When the objective to minimize is of the form f(𝒳) = g(𝒜(𝒳)) + ⟨𝒞, 𝒳⟩, where g is strongly convex and 𝒜 is a linear map (e.g., least squares), a quadratic growth bound holds, which implies linear convergence rates for standard projected gradient methods, even though f need not be strongly convex.

2. For a smooth objective function, when initialized in certain proximity of an optimal solution which satisfies SC, standard projected gradient methods require only SVD computations (for projecting onto the tensor nuclear norm ball) of rank matching the tubal rank of the optimal solution. In particular, when the tubal rank is constant, this implies nearly linear (in the size of the tensor) runtime per iteration, as opposed to superlinear without further assumptions.

3. For a nonsmooth objective function which admits a popular smooth saddle-point formulation, we derive similar results to the latter for the well-known extragradient method.

An additional contribution, which may be of independent interest, is the rigorous extension to tensors of arbitrary order of many basic results that were previously obtained only for third-order tensors.
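To make the projection step in item 2 concrete, the following is a minimal sketch (not the paper's implementation) of Euclidean projection onto a tensor nuclear norm ball for a third-order tensor via the t-SVD: FFT along the third mode, SVD of each frontal slice, joint projection of all singular values onto a scaled ℓ1 ball, and inverse FFT. It assumes the common convention TNN(𝒳) = (1/n3) Σ_k ‖X̂^(k)‖_*; all function names are ours.

```python
import numpy as np

def project_l1_ball(v, radius):
    # Euclidean projection of a nonnegative vector v onto {w >= 0 : sum(w) <= radius}.
    if v.sum() <= radius:
        return v
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    rho = np.nonzero(u - (css - radius) / (np.arange(len(u)) + 1) > 0)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def project_tnn_ball(X, radius):
    """Euclidean projection of a third-order tensor X onto the tensor nuclear
    norm ball of the given radius, assuming TNN(X) = (1/n3) * sum over frontal
    slices (in the Fourier domain) of their matrix nuclear norms."""
    n1, n2, n3 = X.shape
    Xf = np.fft.fft(X, axis=2)                # transform along the third (tubal) mode
    Us, ss, Vts = [], [], []
    for k in range(n3):                       # SVD of each frontal slice in Fourier domain
        U, s, Vt = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        Us.append(U); ss.append(s); Vts.append(Vt)
    # With the 1/n3 scaling, the stacked singular values must satisfy sum <= radius * n3.
    s_proj = project_l1_ball(np.concatenate(ss), radius * n3)
    r = min(n1, n2)
    Yf = np.empty_like(Xf)
    for k in range(n3):                       # rebuild each slice with projected spectrum
        sk = s_proj[k * r:(k + 1) * r]
        Yf[:, :, k] = (Us[k] * sk) @ Vts[k]
    return np.real(np.fft.ifft(Yf, axis=2))   # conjugate symmetry makes the result real
```

Note that the uniform threshold computed by `project_l1_ball` preserves the conjugate symmetry of the Fourier slices, so the inverse FFT returns a real tensor. When the optimal solution has small tubal rank, the full-slice SVDs above can in principle be replaced by rank-restricted ones, which is the source of the per-iteration speedup discussed in the abstract.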


Related research

- Near-optimal sample complexity for convex tensor completion (11/14/2017)
- Linear Convergence of a Frank-Wolfe Type Algorithm over Trace-Norm Balls (08/07/2017)
- Revisiting Frank-Wolfe for Polytopes: Strict Complementarity and Sparsity (05/31/2020)
- Linear Convergence of Frank-Wolfe for Rank-One Matrix Recovery Without Strong Convexity (12/03/2019)
- Interpolating Convex and Non-Convex Tensor Decompositions via the Subspace Norm (03/18/2015)
- Asymptotic Log-Det Rank Minimization via (Alternating) Iteratively Reweighted Least Squares (06/28/2021)
