Near-optimal sample complexity for convex tensor completion
We analyze low-rank tensor completion (TC) using noisy measurements of a subset of the tensor's entries. For a rank-r, order-d, N × N × ... × N tensor with r = O(1), the best sample complexity achieved to date is O(N^{d/2}), obtained by solving a tensor nuclear-norm minimization problem. However, this bound is significantly larger than the number of free variables in a low-rank tensor, which is O(dN). In this paper, we show that by using an atomic norm whose atoms are rank-1 sign tensors, one can obtain a sample complexity of O(dN). Moreover, we generalize the matrix max-norm definition to tensors, which results in a max-quasi-norm (max-qnorm) whose unit ball has small Rademacher complexity. We prove that solving a constrained least squares problem with either the convex atomic norm or the nonconvex max-qnorm achieves optimal sample complexity for low-rank tensor completion. Furthermore, we show that these bounds are nearly minimax rate-optimal. We also provide promising numerical results for max-qnorm constrained tensor completion, showing improved recovery compared to matricization and alternating least squares.
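To make the sampling model and the constrained least squares formulation concrete, below is a minimal, illustrative Python sketch: it generates a rank-r order-3 tensor, observes a random subset of entries with noise, and fits CP factors by gradient descent on the observed-entry squared loss, using a crude row-norm projection on the factors as a heuristic proxy for a max-qnorm-type constraint. This is not the paper's algorithm; the step size, iteration count, row-norm bound, and the projection rule are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, r, d = 20, 2, 3                      # tensor dimensions N^d, CP rank r

# Ground-truth rank-r tensor T = sum_k U[:,k] o V[:,k] o W[:,k]
U, V, W = (rng.standard_normal((N, r)) for _ in range(3))
T = np.einsum('ia,ja,ka->ijk', U, V, W)

# Observe a random subset of entries with additive Gaussian noise
m = int(5 * r * d * N)                  # number of samples, on the O(dN) scale
mask = np.zeros(T.shape, dtype=bool)
idx = rng.choice(T.size, size=m, replace=False)
mask.flat[idx] = True
Y = T + 0.01 * rng.standard_normal(T.shape)   # noisy measurements

def project_rows(A, c):
    """Rescale rows so their Euclidean norm is at most c (heuristic max-qnorm proxy)."""
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    return A * np.minimum(1.0, c / np.maximum(norms, 1e-12))

# Factorized least squares on observed entries with row-norm projection
A, B, C = (0.1 * rng.standard_normal((N, r)) for _ in range(3))
step, max_row_norm = 0.01, 5.0          # assumed hyperparameters
for _ in range(2000):
    That = np.einsum('ia,ja,ka->ijk', A, B, C)
    R = np.where(mask, That - Y, 0.0)           # residual on observed entries only
    gA = np.einsum('ijk,ja,ka->ia', R, B, C)    # gradient w.r.t. factor A
    gB = np.einsum('ijk,ia,ka->ja', R, A, C)
    gC = np.einsum('ijk,ia,ja->ka', R, A, B)
    A = project_rows(A - step * gA, max_row_norm)
    B = project_rows(B - step * gB, max_row_norm)
    C = project_rows(C - step * gC, max_row_norm)

That = np.einsum('ia,ja,ka->ijk', A, B, C)
print("relative error on unobserved entries:",
      np.linalg.norm((That - T)[~mask]) / np.linalg.norm(T[~mask]))
```

The sketch only illustrates the problem setup (noisy partial observations, a factorized low-rank fit, and a norm constraint on the factors); the paper's guarantees concern the atomic-norm and max-qnorm constrained estimators, not this particular heuristic.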