
ParK: Sound and Efficient Kernel Ridge Regression by Feature Space Partitions

by Luigi Carratino et al.

We introduce ParK, a new large-scale solver for kernel ridge regression. Our approach combines partitioning with random projections and iterative optimization to reduce space and time complexity while provably maintaining the same statistical accuracy. In particular, by constructing suitable partitions directly in the feature space rather than in the input space, we promote orthogonality between the local estimators, thus ensuring that key quantities such as the local effective dimension and bias remain under control. We characterize the statistical-computational tradeoff of our model, and demonstrate the effectiveness of our method through numerical experiments on large-scale datasets.
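The general recipe the abstract describes — partition the data in an (approximate) feature space rather than the input space, then fit an independent local kernel ridge regression estimator per cell — can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the paper's ParK construction: the class name `PartitionedKRR` is invented, random Fourier features stand in for the random projection, and a few plain k-means steps stand in for the paper's partitioning scheme.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

class PartitionedKRR:
    """Toy partitioned kernel ridge regression (illustrative only).

    Points are mapped to an approximate feature space via random Fourier
    features, partitioned there with a few k-means (Lloyd) steps, and a
    local KRR estimator is solved on each cell.
    """

    def __init__(self, q=2, lam=1e-3, gamma=1.0, n_features=64, seed=0):
        self.q, self.lam, self.gamma = q, lam, gamma
        self.n_features, self.seed = n_features, seed

    def _features(self, X):
        # Random Fourier features approximating the Gaussian kernel
        # exp(-gamma * ||x - y||^2), with w ~ N(0, 2*gamma*I).
        return np.sqrt(2.0 / self.n_features) * np.cos(X @ self.W + self.b)

    def fit(self, X, y):
        rng = np.random.default_rng(self.seed)
        d = X.shape[1]
        self.W = rng.normal(scale=np.sqrt(2.0 * self.gamma),
                            size=(d, self.n_features))
        self.b = rng.uniform(0.0, 2.0 * np.pi, self.n_features)
        Z = self._features(X)
        # Partition in feature space: a few Lloyd iterations on Z.
        self.centroids = Z[rng.choice(len(Z), self.q, replace=False)]
        for _ in range(10):
            labels = self._assign(Z)
            for j in range(self.q):
                if np.any(labels == j):
                    self.centroids[j] = Z[labels == j].mean(axis=0)
        # One local KRR estimator per cell:
        # alpha_j = (K_jj + lam * n_j * I)^{-1} y_j
        self.X_loc, self.alpha = [], []
        for j in range(self.q):
            Xj, yj = X[labels == j], y[labels == j]
            Kj = rbf_kernel(Xj, Xj, self.gamma)
            self.X_loc.append(Xj)
            self.alpha.append(
                np.linalg.solve(Kj + self.lam * len(Xj) * np.eye(len(Xj)), yj))
        return self

    def _assign(self, Z):
        # Nearest centroid in feature space decides the cell.
        return np.argmin(((Z[:, None, :] - self.centroids[None]) ** 2).sum(-1),
                         axis=1)

    def predict(self, X):
        labels = self._assign(self._features(X))
        out = np.empty(len(X))
        for j in range(self.q):
            m = labels == j
            if np.any(m):
                out[m] = rbf_kernel(X[m], self.X_loc[j], self.gamma) @ self.alpha[j]
        return out
```

The computational saving is visible in the local solves: instead of one n×n system, the model solves q systems of roughly (n/q)×(n/q), and prediction touches only the cell a test point falls into.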



