Testing Positive Semidefiniteness Using Linear Measurements

04/08/2022
by Deanna Needell, et al.

We study the problem of testing whether a symmetric d × d input matrix A is positive semidefinite (PSD), or is ϵ-far from the PSD cone, meaning that λ_min(A) ≤ −ϵ‖A‖_p, where ‖A‖_p is the Schatten-p norm of A. In applications one often needs to quickly tell whether an input matrix is PSD, and a small distance from the PSD cone may be tolerable. We consider two well-studied query models for measuring efficiency, namely the matrix-vector and vector-matrix-vector query models. We first consider one-sided testers, which correctly classify any PSD input but may fail on a non-PSD input with a tiny failure probability. Up to logarithmic factors, in the matrix-vector query model we show a tight Θ(1/ϵ^{p/(2p+1)}) bound, while in the vector-matrix-vector query model we show a tight Θ(d^{1−1/p}/ϵ) bound, for every p ≥ 1. We also show a strong separation between one-sided and two-sided testers in the vector-matrix-vector model, where a two-sided tester may fail on both PSD and non-PSD inputs with a tiny failure probability. In particular, for the important case of the Frobenius norm (p = 2), we show that any one-sided tester requires Ω(√d/ϵ) queries. However, we introduce a bilinear sketch for two-sided testing, from which we construct a Frobenius norm tester achieving the optimal O(1/ϵ^2) queries. We also give a number of additional separations between adaptive and non-adaptive testers. Our techniques have implications beyond testing: they provide new methods to approximate the spectrum of a matrix with Frobenius norm error using dimensionality reduction, in a way that preserves the signs of eigenvalues.
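To make the two query models concrete, here is a minimal illustrative sketch in Python (with numpy). It is our own illustration, not the paper's construction: the function names, the query budgets num_queries and k, and the rejection threshold are hypothetical parameters, and the two-sided routine assumes the natural bilinear sketch form G A G^T with a Gaussian G. What it does preserve is the defining logic from the abstract: the one-sided tester never rejects a PSD input, since x^T A x ≥ 0 for every query x when A is PSD, and the two-sided tester reads A only through k^2 vector-matrix-vector queries (O(1/ϵ^2) when k = O(1/ϵ)).

```python
import numpy as np

def one_sided_psd_test_matvec(matvec, d, num_queries, seed=None):
    """One-sided PSD tester in the matrix-vector query model.

    `matvec(x)` answers one query, returning A @ x for the hidden
    symmetric d x d matrix A. A PSD input is never rejected, because
    x^T A x >= 0 for every x when A is PSD; a negative quadratic form
    is a certificate of non-PSDness.
    """
    rng = np.random.default_rng(seed)
    for _ in range(num_queries):
        x = rng.standard_normal(d)
        if x @ matvec(x) < 0:
            return False  # certified: A is not PSD
    return True  # no negative direction found; declare PSD


def two_sided_psd_test_bilinear(quadform, d, k, threshold, seed=None):
    """Two-sided tester built from a k x k bilinear sketch G A G^T.

    `quadform(u, v)` answers one vector-matrix-vector query u^T A v,
    so forming the sketch costs k^2 queries. `threshold` is a
    hypothetical calibration parameter; the paper's Frobenius-norm
    tester uses a more refined decision rule.
    """
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((k, d))
    # Sketch M = G A G^T, one bilinear query per entry.
    M = np.array([[quadform(G[i], G[j]) for j in range(k)] for i in range(k)])
    M = (M + M.T) / 2  # symmetrize against floating-point asymmetry
    return bool(np.linalg.eigvalsh(M)[0] >= -threshold)


if __name__ == "__main__":
    d = 50
    psd, far = np.eye(d), -np.eye(d)  # a PSD input and a far-from-PSD input
    print(one_sided_psd_test_matvec(lambda x: psd @ x, d, 30))                  # True
    print(one_sided_psd_test_matvec(lambda x: far @ x, d, 30))                  # False
    print(two_sided_psd_test_bilinear(lambda u, v: u @ (far @ v), d, 10, 0.0))  # False
```

The demo uses ±I only as an easy sanity check; random Gaussian probes can of course miss a small negative eigenvalue, which is exactly the regime that the lower bounds above quantify.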
