Randomized low-rank approximation of parameter-dependent matrices

02/24/2023
by Daniel Kressner, et al.

This work considers the low-rank approximation of a matrix A(t) depending on a parameter t in a compact set D ⊂ ℝ^d. Application areas that give rise to such problems include computational statistics and dynamical systems. Randomized algorithms are an increasingly popular approach for performing low-rank approximation, and they usually proceed by multiplying the matrix with random dimension reduction matrices (DRMs). Applying such algorithms directly to A(t) would involve different, independent DRMs for every t, which is not only expensive but also leads to inherently non-smooth approximations. In this work, we propose to use constant DRMs, that is, A(t) is multiplied with the same DRM for every t. The resulting parameter-dependent extensions of two popular randomized algorithms, the randomized singular value decomposition and the generalized Nyström method, are computationally attractive, especially when A(t) admits an affine linear decomposition with respect to t. We perform a probabilistic analysis for both algorithms, deriving bounds on the expected value as well as failure probabilities for the L^2 approximation error when using Gaussian random DRMs. Both the theoretical results and the numerical experiments show that the use of constant DRMs does not impair the effectiveness of these algorithms; our methods reliably return quasi-best low-rank approximations.
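To illustrate the core idea, the following is a minimal sketch (not the paper's reference implementation) of a randomized SVD applied to a parameter-dependent matrix A(t), where a single Gaussian DRM is drawn once and reused for every parameter value t. The function and parameter names (`A_of_t`, `rank`, `oversampling`) are illustrative assumptions, and the example uses an affine-linear dependence A(t) = A0 + t·A1 as in the setting the abstract mentions.

```python
import numpy as np

def randomized_svd_constant_drm(A_of_t, ts, n, rank, oversampling=10, seed=0):
    """For each t in ts, return a rank-`rank` approximation (U, S, Vt) of A(t),
    computed with the *same* Gaussian DRM Omega for all parameter values."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((n, rank + oversampling))  # constant DRM, drawn once
    approximations = []
    for t in ts:
        A = A_of_t(t)                       # assemble (or apply) A(t)
        Q, _ = np.linalg.qr(A @ Omega)      # sketch the range of A(t) with the shared Omega
        B = Q.T @ A                         # project onto the sketched range
        Ub, S, Vt = np.linalg.svd(B, full_matrices=False)
        U = Q @ Ub
        approximations.append((U[:, :rank], S[:rank], Vt[:rank, :]))
    return approximations

if __name__ == "__main__":
    # Example with affine-linear parameter dependence A(t) = A0 + t * A1.
    m, n = 500, 400
    rng = np.random.default_rng(1)
    A0 = rng.standard_normal((m, 20)) @ rng.standard_normal((20, n))
    A1 = rng.standard_normal((m, 20)) @ rng.standard_normal((20, n))
    approx = randomized_svd_constant_drm(lambda t: A0 + t * A1,
                                         ts=np.linspace(0.0, 1.0, 5),
                                         n=n, rank=25)
```

Because Omega is fixed, the sketches A(t)·Omega inherit the smoothness of A(t) in t, which is what motivates the constant-DRM variants analyzed in the paper.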
