Kernel regression, minimax rates and effective dimensionality: beyond the regular case

11/12/2016
by   Gilles Blanchard, et al.

We investigate whether kernel regularization methods can achieve minimax convergence rates under a source-condition regularity assumption on the target function. These questions have been considered in the literature, but only under specific assumptions on the decay, typically polynomial, of the spectrum of the kernel-mapping covariance operator. With a view toward distribution-free results, we study this issue under much weaker assumptions on the eigenvalue decay, allowing for more complex behavior that can reflect different structure in the data at different scales.
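To make the objects in the abstract concrete, here is a minimal, self-contained sketch (not code from the paper) of kernel ridge regression on synthetic data, together with the empirical spectrum of the normalized Gram matrix, whose eigenvalue decay plays the role of the covariance-operator spectrum discussed above. The data, the Gaussian kernel, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data (illustrative only, not from the paper).
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(3.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

def gaussian_kernel(A, B, bandwidth=0.2):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * bandwidth ** 2))

# Kernel ridge regression: solve (K + n*lambda*I) alpha = y.
K = gaussian_kernel(X, X)
lam = 1e-3
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)

# The spectrum of K/n is the empirical analogue of the kernel-mapping
# covariance operator's eigenvalues; its decay profile is what governs
# the attainable minimax rate in analyses of this kind.
eigvals = np.sort(np.linalg.eigvalsh(K / n))[::-1]
print(eigvals[:5])

# In-sample fit of the regularized estimator.
y_hat = K @ alpha
print(float(np.mean((y - y_hat) ** 2)))
```

Polynomial decay of `eigvals` corresponds to the "regular" case treated in earlier work; the paper's point is that real spectra may decay differently at different scales, and the rates should be expressed in terms of the actual decay profile rather than a single polynomial exponent.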


research · 07/08/2016
Convergence rates of Kernel Conjugate Gradient for random design regression
We prove statistical rates of convergence for kernel-based least squares...

research · 11/07/2016
Optimal rates for the regularized learning algorithms under general source condition
We consider the learning algorithms under general source condition with ...

research · 09/29/2010
Optimal learning rates for Kernel Conjugate Gradient regression
We prove rates of convergence in the statistical sense for kernel-based ...

research · 05/31/2021
Generalization Error Rates in Kernel Regression: The Crossover from the Noiseless to Noisy Regime
In this manuscript we consider Kernel Ridge Regression (KRR) under the G...

research · 01/29/2022
Error Rates for Kernel Classification under Source and Capacity Conditions
In this manuscript, we consider the problem of kernel classification und...

research · 09/25/2022
Capacity dependent analysis for functional online learning algorithms
This article provides convergence analysis of online stochastic gradient...

research · 05/05/2023
Random Smoothing Regularization in Kernel Gradient Descent Learning
Random smoothing data augmentation is a unique form of regularization th...
