Linear and Sublinear Time Spectral Density Estimation

04/08/2021
by Vladimir Braverman, et al.

We analyze the popular kernel polynomial method (KPM) for approximating the spectral density (eigenvalue distribution) of a real symmetric (or Hermitian) matrix A ∈ ℝ^{n×n}. We prove that a simple and practical variant of the KPM algorithm can approximate the spectral density to ϵ accuracy in the Wasserstein-1 distance with roughly O(1/ϵ) matrix-vector multiplications with A. This yields a provably linear-time algorithm for the problem. The KPM variant we study is based on damped Chebyshev polynomial expansions. We show that it is stable, meaning that it can be combined with any approximate matrix-vector multiplication algorithm for A. As an application, we develop an O(n/poly(ϵ))-time algorithm for computing the spectral density of any n × n normalized graph adjacency or Laplacian matrix. This runtime is sublinear in the size of the matrix and assumes sample access to the graph. Our approach leverages several tools from approximation theory, including Jackson's seminal work on approximation with positive kernels [Jackson, 1912], and stability properties of three-term recurrence relations for orthogonal polynomials.
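To make the method concrete, below is a minimal sketch of a KPM-style spectral density estimator with Jackson damping. It is an illustrative implementation assumed from the standard KPM recipe, not the paper's exact algorithm: it assumes the input matrix is symmetric with eigenvalues already scaled into [-1, 1], uses Hutchinson-style Rademacher probe vectors to estimate the Chebyshev moments tau_k = (1/n) tr(T_k(A)) via the three-term recurrence, and evaluates the damped series on a grid. The function names (`jackson_coefficients`, `kpm_spectral_density`) and all parameter choices are hypothetical.

```python
import numpy as np

def jackson_coefficients(num_moments):
    # Jackson damping factors g_k; they turn the truncated Chebyshev
    # series into an approximation by a positive kernel [Jackson, 1912],
    # which suppresses Gibbs oscillations and keeps the estimate nonnegative.
    N = num_moments
    k = np.arange(N)
    return ((N - k + 1) * np.cos(np.pi * k / (N + 1))
            + np.sin(np.pi * k / (N + 1)) / np.tan(np.pi / (N + 1))) / (N + 1)

def kpm_spectral_density(matvec, n, num_moments=40, num_vectors=20,
                         grid=None, rng=None):
    """Approximate the spectral density of a symmetric n x n matrix with
    eigenvalues in [-1, 1], accessed only through `matvec(v) -> A @ v`."""
    rng = np.random.default_rng(rng)
    # Estimate Chebyshev moments tau_k = (1/n) tr(T_k(A)) by averaging
    # z^T T_k(A) z over random Rademacher vectors z, using the recurrence
    # T_{k+1}(A) z = 2 A T_k(A) z - T_{k-1}(A) z.
    taus = np.zeros(num_moments)
    for _ in range(num_vectors):
        z = rng.choice([-1.0, 1.0], size=n)
        t_prev, t_curr = z, matvec(z)
        taus[0] += z @ t_prev
        taus[1] += z @ t_curr
        for k in range(2, num_moments):
            t_prev, t_curr = t_curr, 2 * matvec(t_curr) - t_prev
            taus[k] += z @ t_curr
    taus /= num_vectors * n
    g = jackson_coefficients(num_moments)
    if grid is None:
        # Chebyshev nodes: cluster near the spectrum's edges, avoid x = +-1.
        grid = np.cos(np.pi * (np.arange(512) + 0.5) / 512)
    # Evaluate the damped series
    #   rho(x) ~= (g_0 tau_0 + 2 sum_k g_k tau_k T_k(x)) / (pi sqrt(1 - x^2)).
    k = np.arange(1, num_moments)
    T = np.cos(np.outer(np.arccos(grid), k))  # T_k(x) on the grid
    density = (g[0] * taus[0] + 2 * T @ (g[1:] * taus[1:])) \
        / (np.pi * np.sqrt(1 - grid**2))
    return grid, density
```

Note that each probe vector costs `num_moments` matrix-vector products, so the total work is O(num_moments * num_vectors) matvecs; the O(1/ϵ) bound in the abstract corresponds to the number of moments needed for ϵ accuracy in Wasserstein-1 distance. Because only `matvec` is required, an approximate multiplication routine (as in the paper's sublinear-time graph application) can be dropped in directly.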
