Analysis of regularized Nyström subsampling for regression functions of low smoothness

06/03/2018
by   Shuai Lu, et al.

This paper studies a Nyström-type subsampling approach to large-scale kernel learning in the misspecified case, where the target function is not assumed to belong to the reproducing kernel Hilbert space generated by the underlying kernel. Despite its practical importance, this case is less well understood. To model it, the smoothness of the target function is described in terms of general source conditions. Surprisingly, for almost the whole range of source conditions describing the misspecified case, the corresponding learning-rate bounds can be achieved with a single value of the regularization parameter. This observation allows the formulation of mild conditions under which plain Nyström subsampling can be realized at subquadratic cost while maintaining the guaranteed learning rates.
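To make the setting concrete, below is a minimal sketch of plain (uniform) Nyström subsampling for kernel ridge regression, the kind of method the abstract refers to: m landmark points are drawn uniformly from the n training points, and the regression is solved over their span, reducing the cost from the O(n^3) of full kernel ridge regression to O(n m^2). The function names, the Gaussian kernel, and the jitter term are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian kernel matrix: k(a, b) = exp(-gamma * ||a - b||^2).
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def nystrom_krr_fit(X, y, m, lam, gamma=1.0, rng=None):
    """Plain Nystrom subsampling for kernel ridge regression.

    Draws m landmarks uniformly at random, then solves the reduced m x m system
        (K_nm^T K_nm + n * lam * K_mm) alpha = K_nm^T y,
    where K_nm = k(X, X_m) and K_mm = k(X_m, X_m).
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # uniform ("plain") subsampling
    Xm = X[idx]
    K_nm = rbf_kernel(X, Xm, gamma)
    K_mm = rbf_kernel(Xm, Xm, gamma)
    A = K_nm.T @ K_nm + n * lam * K_mm
    # Tiny jitter on the diagonal keeps the reduced system numerically solvable.
    alpha = np.linalg.solve(A + 1e-10 * np.eye(m), K_nm.T @ y)
    return Xm, alpha

def nystrom_krr_predict(Xm, alpha, Xtest, gamma=1.0):
    # The estimator lives in the span of the landmark sections k(., x_i).
    return rbf_kernel(Xtest, Xm, gamma) @ alpha
```

For example, fitting a noisy sine curve with n = 300 samples and only m = 50 landmarks recovers the target to roughly the noise level, while the linear solve involves only a 50 x 50 system.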


Related research

05/14/2014
Learning rates for the risk of kernel based quantile regression estimators in additive models
Additive models play an important role in semiparametric statistics. Thi...

11/20/2022
Statistical Optimality of Divide and Conquer Kernel-based Functional Linear Regression
Previous analysis of regularized functional linear regression in a repro...

11/02/2019
Convergence results and low order rates for nonlinear Tikhonov regularization with oversmoothing penalty term
For the Tikhonov regularization of ill-posed nonlinear operator equation...

08/15/2023
On regularized Radon-Nikodym differentiation
We discuss the problem of estimating Radon-Nikodym derivatives. This pro...

05/24/2016
Convergence guarantees for kernel-based quadrature rules in misspecified settings
Kernel-based quadrature rules are becoming important in machine learning...

04/16/2022
PAC-Bayesian Based Adaptation for Regularized Learning
In this paper, we propose a PAC-Bayesian a posteriori parameter selectio...

02/28/2022
On the Benefits of Large Learning Rates for Kernel Methods
This paper studies an intriguing phenomenon related to the good generali...
