Matrix Completion in Almost-Verification Time

08/07/2023 · by Jonathan A. Kelner, et al.

We give a new framework for solving the fundamental problem of low-rank matrix completion, i.e., approximating a rank-r matrix 𝐌∈ℝ^{m × n} (where m ≥ n) from random observations. First, we provide an algorithm which completes 𝐌 on 99% of rows and columns under no further assumptions on 𝐌 from ≈ mr samples and using ≈ mr^2 time. Then, assuming the row and column spans of 𝐌 satisfy additional regularity properties, we show how to boost this partial completion guarantee to a full matrix completion algorithm by aggregating solutions to regression problems involving the observations. In the well-studied setting where 𝐌 has incoherent row and column spans, our algorithms complete 𝐌 to high precision from mr^{2+o(1)} observations in mr^{3+o(1)} time (omitting logarithmic factors in problem parameters), improving upon the prior state-of-the-art [JN15] which used ≈ mr^5 samples and ≈ mr^7 time. Under an assumption on the row and column spans of 𝐌 we introduce (which is satisfied by random subspaces with high probability), our sample complexity improves to an almost information-theoretically optimal mr^{1+o(1)}, and our runtime improves to mr^{2+o(1)}. Our runtimes have the appealing property of matching the best known runtime to verify that a rank-r decomposition 𝐔𝐕^⊤ agrees with the sampled observations. We also provide robust variants of our algorithms that, given random observations from 𝐌 + 𝐍 with ‖𝐍‖_F ≤ Δ, complete 𝐌 to Frobenius norm distance ≈ r^{1.5}Δ in the same runtimes as the noiseless setting. Prior noisy matrix completion algorithms [CP10] only guaranteed a distance of ≈ √(n)Δ.
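To make the problem setup concrete, the sketch below illustrates matrix completion from entrywise-sampled observations via a textbook alternating least-squares baseline — not the paper's algorithm, which achieves its guarantees through partial completion and regression aggregation — solving a small r-dimensional regression per row and column, restricted to observed entries. The dimensions, sampling probability, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 60, 40, 3  # illustrative sizes; the paper's regime is m >= n, rank r

# Ground-truth rank-r matrix M = U V^T. Random factors give random row/column
# spans, which the paper notes satisfy its regularity assumption w.h.p.
U_true = rng.standard_normal((m, r))
V_true = rng.standard_normal((n, r))
M = U_true @ V_true.T

# Observe each entry independently with probability p.
p = 0.5
mask = rng.random((m, n)) < p

# Alternating least squares: fix V and solve a small regression for each row
# of U using only that row's observed entries, then do the same for V.
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
for _ in range(50):
    for i in range(m):
        obs = mask[i]
        if obs.sum() >= r:
            U[i], *_ = np.linalg.lstsq(V[obs], M[i, obs], rcond=None)
    for j in range(n):
        obs = mask[:, j]
        if obs.sum() >= r:
            V[j], *_ = np.linalg.lstsq(U[obs], M[obs, j], rcond=None)

err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(f"relative Frobenius error: {err:.2e}")
```

Note that verifying a candidate decomposition 𝐔𝐕^⊤ against the samples — computing 𝐔_i·𝐕_j for each observed (i, j) — costs roughly r time per observation, which is the verification-time benchmark the paper's runtimes match.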
