Sum-of-squares meets square loss: Fast rates for agnostic tensor completion

05/30/2019
by   Dylan J. Foster, et al.

We study tensor completion in the agnostic setting. In the classical tensor completion problem, we receive n entries of an unknown rank-r tensor and wish to exactly complete the remaining entries. In agnostic tensor completion, we make no assumption on the rank of the unknown tensor, but attempt to predict the unknown entries as well as the best rank-r tensor. For agnostic learning of third-order tensors with the square loss, we give the first polynomial-time algorithm that obtains a "fast" (i.e., O(1/n)-type) rate, improving over the rate obtained by reduction to matrix completion. Our prediction error rate for competing with the best rank-r d×d×d tensor is Õ(r^2 d^{3/2}/n). We also obtain an exact oracle inequality that trades off estimation and approximation error. Our algorithm is based on the degree-six sum-of-squares relaxation of the tensor nuclear norm. The key feature of our analysis is to show that a certain characterization of the subgradient of the tensor nuclear norm can be encoded in the sum-of-squares proof system. This unlocks the standard toolbox for localization of empirical processes under the square loss and allows us to establish restricted-eigenvalue-type guarantees for various tensor regression models, with tensor completion as a special case. Our new analysis of the relaxation complements Barak and Moitra (2016), who gave slow rates for agnostic tensor completion, and Potechin and Steurer (2017), who gave exact recovery guarantees for the noiseless setting. Our techniques are user-friendly, and we anticipate that they will find use elsewhere.
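To make the agnostic setup concrete, here is a minimal numerical sketch. It is not the paper's algorithm (which is a degree-six sum-of-squares relaxation of the tensor nuclear norm); instead it fits a rank-r CP model to the observed entries by plain gradient descent, using a hypothetical helper `cp_fit`, to illustrate what "predicting unseen entries as well as the best rank-r tensor" means when the ground-truth tensor is not exactly low rank.

```python
import numpy as np

# Sketch of the agnostic tensor completion *setup* only. The paper's
# polynomial-time algorithm (degree-six SOS relaxation) is not reproduced
# here; cp_fit below is a hypothetical illustrative helper.

def cp_fit(T, mask, r, steps=4000, lr=5e-3, seed=0):
    """Fit factors A, B, C (each d×r) so that sum_l A[:,l]⊗B[:,l]⊗C[:,l]
    matches T on the observed entries (mask == True)."""
    rng = np.random.default_rng(seed)
    d1, d2, d3 = T.shape
    A = 0.1 * rng.standard_normal((d1, r))
    B = 0.1 * rng.standard_normal((d2, r))
    C = 0.1 * rng.standard_normal((d3, r))
    for _ in range(steps):
        T_hat = np.einsum('il,jl,kl->ijk', A, B, C)
        G = np.where(mask, T_hat - T, 0.0)  # residual on observed entries only
        # Gradients of the squared loss on observed entries w.r.t. each factor.
        gA = np.einsum('ijk,jl,kl->il', G, B, C)
        gB = np.einsum('ijk,il,kl->jl', G, A, C)
        gC = np.einsum('ijk,il,jl->kl', G, A, B)
        A, B, C = A - lr * gA, B - lr * gB, C - lr * gC
    return np.einsum('il,jl,kl->ijk', A, B, C)

# Agnostic setting: the unknown tensor is arbitrary (here, low rank plus
# noise), and we only ask to compete with the best rank-r approximation.
rng = np.random.default_rng(1)
d, r = 6, 2
low_rank = np.einsum('il,jl,kl->ijk',
                     rng.standard_normal((d, r)),
                     rng.standard_normal((d, r)),
                     rng.standard_normal((d, r)))
T = low_rank + 0.1 * rng.standard_normal((d, d, d))  # not exactly rank r
mask = rng.random((d, d, d)) < 0.7                   # observe ~70% of entries

T_hat = cp_fit(T, mask, r)
train_mse = np.mean((T_hat - T)[mask] ** 2)
test_mse = np.mean((T_hat - T)[~mask] ** 2)
print(f"observed-entry MSE: {train_mse:.4f}, held-out MSE: {test_mse:.4f}")
```

The prediction error on the unobserved entries is the quantity the paper's Õ(r^2 d^{3/2}/n) rate controls; the nonconvex gradient-descent fit above carries no such guarantee, which is exactly the gap the sum-of-squares relaxation closes.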



Related research

- Exact tensor completion with sum-of-squares (02/21/2017)
  We obtain the first polynomial-time algorithm for exact tensor completio...
- Noisy Tensor Completion via the Sum-of-Squares Hierarchy (01/26/2015)
  In the noisy tensor completion problem we observe m entries (whose locat...
- Exact nuclear norm, completion and decomposition for random overcomplete tensors via degree-4 SOS (11/18/2020)
  In this paper we show that simple semidefinite programs inspired by degr...
- On Tensor Completion via Nuclear Norm Minimization (05/07/2014)
  Many problems can be formulated as recovering a low-rank tensor. Althoug...
- Asymptotic Log-Det Sum-of-Ranks Minimization via Tensor (Alternating) Iteratively Reweighted Least Squares (06/29/2021)
  Affine sum-of-ranks minimization (ASRM) generalizes the affine rank mini...
- On Polynomial Time Methods for Exact Low Rank Tensor Completion (02/22/2017)
  In this paper, we investigate the sample size requirement for exact reco...
- Rank regularization and Bayesian inference for tensor completion and extrapolation (01/31/2013)
  A novel regularizer of the PARAFAC decomposition factors capturing the t...
