Low-Rank Approximation with 1/ϵ^{1/3} Matrix-Vector Products

02/10/2022
by Ainesh Bakshi, et al.

We study iterative methods based on Krylov subspaces for low-rank approximation under any Schatten-p norm. Here, given access to a matrix A through matrix-vector products, an accuracy parameter ϵ, and a target rank k, the goal is to find a rank-k matrix Z with orthonormal columns such that ‖A(I − ZZ^⊤)‖_S_p ≤ (1+ϵ) min_{U^⊤U = I_k} ‖A(I − UU^⊤)‖_S_p, where ‖M‖_S_p denotes the ℓ_p norm of the singular values of M. For the special cases of p = 2 (Frobenius norm) and p = ∞ (spectral norm), Musco and Musco (NeurIPS 2015) obtained an algorithm based on Krylov methods that uses Õ(k/√ϵ) matrix-vector products, improving on the naïve Õ(k/ϵ) dependence obtainable by the power method, where Õ suppresses poly(log(dk/ϵ)) factors. Our main result is an algorithm that uses only Õ(kp^{1/6}/ϵ^{1/3}) matrix-vector products and works for all p ≥ 1. For p = 2, our bound improves the previous Õ(k/ϵ^{1/2}) bound to Õ(k/ϵ^{1/3}). Since the Schatten-p and Schatten-∞ norms are the same up to a (1+ϵ) factor when p ≥ (log d)/ϵ, our bound recovers the result of Musco and Musco for p = ∞. Further, we prove a matrix-vector query lower bound of Ω(1/ϵ^{1/3}) for any fixed constant p ≥ 1, showing that, surprisingly, Θ̃(1/ϵ^{1/3}) is the optimal complexity for constant k. To obtain our results, we introduce several new techniques, including optimizing over multiple Krylov subspaces simultaneously, and pinching inequalities for partitioned operators. Our lower bound for p ∈ [1,2] uses the Araki–Lieb–Thirring trace inequality, whereas for p > 2, we appeal to a norm-compression inequality for aligned partitioned operators.
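To make the matrix-vector access model concrete, below is a minimal Python/NumPy sketch of the standard single-block Krylov baseline in the style of Musco and Musco, where A is touched only through products A·x and A^⊤·y. This is not the paper's algorithm (which additionally optimizes over multiple Krylov subspaces simultaneously to reach the Õ(kp^{1/6}/ϵ^{1/3}) bound); the function name block_krylov_lra, the handles A_mv/At_mv, and the iteration count q are illustrative assumptions.

import numpy as np

def block_krylov_lra(A_mv, At_mv, n, k, q, seed=0):
    """Rank-k approximation of an m x n matrix A accessed only via products.

    A_mv(X)  should return A   @ X for an n x b block X (b matvecs with A).
    At_mv(Y) should return A.T @ Y for an m x b block Y (b matvecs with A^T).
    q is the number of (A A^T) powers in the Krylov block; larger q gives
    higher accuracy at the cost of more matrix-vector products.
    """
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, k))          # random Gaussian start block
    V = A_mv(X)                              # A X
    blocks = [V]
    for _ in range(q):                       # [AX, (AA^T)AX, ..., (AA^T)^q AX]
        V = A_mv(At_mv(V))
        blocks.append(V)
    Q, _ = np.linalg.qr(np.hstack(blocks))   # orthonormal Krylov basis, m x (q+1)k
    B = At_mv(Q).T                           # B = Q^T A, a small (q+1)k x n matrix
    U, _, _ = np.linalg.svd(B, full_matrices=False)
    return Q @ U[:, :k]                      # Z with orthonormal columns; ZZ^T A ≈ A_k

For instance, with A = rng.standard_normal((2000, 500)) one would call block_krylov_lra(lambda X: A @ X, lambda Y: A.T @ Y, 500, k=10, q=8). The routine spends (3q+2)k products with A or A^⊤ in total, and in the Frobenius-norm analysis q ≈ 1/√ϵ iterations suffice for a (1+ϵ)-approximation: this is the Õ(k/√ϵ) baseline that the paper improves to Õ(k/ϵ^{1/3}).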


