Robust, randomized preconditioning for kernel ridge regression

04/24/2023
by Mateo Díaz et al.

This paper introduces two randomized preconditioning techniques for robustly solving kernel ridge regression (KRR) problems with a medium to large number of data points (10^4 ≤ N ≤ 10^7). The first method, RPCholesky preconditioning, accurately solves the full-data KRR problem in O(N^2) arithmetic operations, assuming sufficiently rapid polynomial decay of the kernel matrix eigenvalues. The second method, KRILL preconditioning, accurately solves a restricted version of the KRR problem involving k ≪ N selected data centers at a cost of O((N + k^2) k log k) operations. The proposed methods handle a broad range of KRR problems and overcome the failure modes of previous KRR preconditioners, making them well suited to practical applications.
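To make the first idea concrete, here is a minimal NumPy sketch of RPCholesky-style preconditioning for KRR: a randomly pivoted partial Cholesky factorization (pivots sampled in proportion to the residual diagonal) yields a rank-k approximation K ≈ F Fᵀ, which is turned into a preconditioner for conjugate gradients on (K + λI)α = y via the Woodbury identity. The function names `rpcholesky` and `krr_pcg` and all parameter choices are our own illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rpcholesky(K, k, seed=0):
    """Randomly pivoted partial Cholesky: K ~= F @ F.T with F of rank k.
    Pivots are sampled in proportion to the residual diagonal."""
    rng = np.random.default_rng(seed)
    N = K.shape[0]
    F = np.zeros((N, k))
    d = np.diag(K).astype(float).copy()   # residual diagonal of K - F F^T
    for i in range(k):
        p = rng.choice(N, p=d / d.sum())  # sample pivot proportional to d
        g = K[:, p] - F[:, :i] @ F[p, :i] # residual column at the pivot
        F[:, i] = g / np.sqrt(g[p])
        d = np.maximum(d - F[:, i] ** 2, 0.0)
        d[p] = 0.0                        # pivot is now fully resolved
    return F

def krr_pcg(K, y, lam, k, tol=1e-10, maxiter=500):
    """Solve (K + lam*I) alpha = y by preconditioned CG, using the rank-k
    RPCholesky approximation and the Woodbury identity as preconditioner."""
    F = rpcholesky(K, k)
    U, s, _ = np.linalg.svd(F, full_matrices=False)
    def prec(r):
        # (F F^T + lam I)^{-1} r, since F F^T = U diag(s^2) U^T
        Ur = U.T @ r
        return U @ (Ur / (s ** 2 + lam)) + (r - U @ Ur) / lam
    matvec = lambda v: K @ v + lam * v
    x = np.zeros_like(y)
    r = y - matvec(x)
    z = prec(r)
    p, rz = z.copy(), r @ z
    for _ in range(maxiter):              # standard preconditioned CG loop
        Ap = matvec(p)
        a = rz / (p @ Ap)
        x += a * p
        r -= a * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(y):
            break
        z = prec(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

On a toy Gaussian-kernel problem, `krr_pcg(K, y, lam, k=50)` drives the relative residual of (K + λI)α = y to near machine precision in a handful of iterations, reflecting the fast eigenvalue decay the paper's O(N^2) guarantee relies on.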
