Optimal learning rates for Kernel Conjugate Gradient regression

09/29/2010
by Gilles Blanchard, et al.

We prove rates of convergence in the statistical sense for kernel-based least squares regression using a conjugate gradient algorithm, where regularization against overfitting is obtained by early stopping. This method is directly related to Kernel Partial Least Squares, a regression method that combines supervised dimensionality reduction with least squares projection. The rates depend on two key quantities: the regularity of the target regression function and the intrinsic dimensionality of the data mapped into the kernel space. Lower bounds on the attainable rates in terms of these two quantities were established in earlier literature, and we obtain upper bounds for the considered method that match these lower bounds (up to a log factor) if the true regression function belongs to the reproducing kernel Hilbert space. If this assumption is not fulfilled, we obtain similar convergence rates provided additional unlabeled data are available. The order of the learning rates matches state-of-the-art results recently obtained for least squares support vector machines and for linear regularization operators.
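To make the early-stopping mechanism concrete, below is a minimal Python sketch of kernel least squares fitted by plain conjugate gradient on the kernel system K alpha = y, where the number of CG iterations plays the role of the regularization parameter and a hold-out set is used to pick the stopping iteration. The Gaussian kernel, the function names (gaussian_kernel, kernel_cg_early_stopping), and the hold-out stopping rule are illustrative assumptions, not necessarily the exact CG variant analyzed in the paper.

import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_cg_early_stopping(X_tr, y_tr, X_val, y_val, gamma=1.0, max_iter=50):
    # Plain conjugate gradient on K @ alpha = y. After t steps the iterate
    # lies in the order-t Krylov space span{y, K y, ..., K^(t-1) y}, so t
    # acts as the regularization parameter; we keep the iterate with the
    # smallest hold-out error instead of running CG to convergence.
    K = gaussian_kernel(X_tr, X_tr, gamma)
    K_val = gaussian_kernel(X_val, X_tr, gamma)
    alpha = np.zeros(len(y_tr))
    r = y_tr.astype(float)          # residual y - K @ alpha
    p = r.copy()                    # CG search direction
    best_err, best_alpha, best_t = np.inf, alpha.copy(), 0
    for t in range(1, max_iter + 1):
        Kp = K @ p
        step = (r @ r) / (p @ Kp)
        alpha = alpha + step * p
        r_new = r - step * Kp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
        val_err = np.mean((K_val @ alpha - y_val) ** 2)
        if val_err < best_err:
            best_err, best_alpha, best_t = val_err, alpha.copy(), t
        if r @ r < 1e-12 * (y_tr @ y_tr):   # residual negligible: stop
            break
    return best_alpha, best_t

For example, on noisy samples of a smooth target the hold-out error typically decreases for a few iterations and then rises again as the iterates begin to fit the noise; the returned stopping iteration marks that turning point:

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)
alpha, t_stop = kernel_cg_early_stopping(X[:150], y[:150], X[150:], y[150:], gamma=5.0)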

Related research

07/08/2016: Convergence rates of Kernel Conjugate Gradient for random design regression
03/12/2018: Optimal Rates of Sketched-regularized Algorithms for Least-Squares Regression over Hilbert Spaces
02/17/2016: Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
11/12/2016: Kernel regression, minimax rates and effective dimensionality: beyond the regular case
10/24/2016: Parallelizing Spectral Algorithms for Kernel Learning
11/20/2022: Statistical Optimality of Divide and Conquer Kernel-based Functional Linear Regression
05/21/2012: Conditional mean embeddings as regressors - supplementary
