Fast Learning Rate of lp-MKL and its Minimax Optimality

03/27/2011
by Taiji Suzuki, et al.

In this paper, we give a new sharp generalization bound for lp-MKL, a generalization of the multiple kernel learning (MKL) framework that imposes lp-mixed-norm regularization in place of l1-mixed-norm regularization. We use localization techniques to obtain the sharp learning rate. The bound is characterized by the decay rate of the eigenvalues of the associated kernels: the faster the eigenvalues decay, the faster the convergence rate. Furthermore, we derive the minimax learning rate on the lp-mixed-norm ball in the product space, and show that the learning rate of lp-MKL derived here achieves this minimax optimal rate.
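
The abstract centers on the lp-mixed-norm regularizer, (∑_{m=1}^M ‖f_m‖_{H_m}^p)^{1/p}, taken over the component functions f_m of the M reproducing kernel Hilbert spaces. As a rough, hypothetical illustration of this general framework (not the estimator or proof technique analyzed in the paper), the following NumPy sketch alternates between kernel ridge regression with a weighted kernel sum and the closed-form ℓ_p-norm weight update of Kloft et al. (2011); the function name lp_mkl_krr, the ridge constant lam, and the toy data below are all assumptions made for illustration.

    import numpy as np

    def lp_mkl_krr(kernels, y, p=2.0, lam=1e-2, n_iter=20):
        """Hypothetical sketch of lp-norm MKL via kernel ridge regression."""
        M, n = len(kernels), len(y)
        theta = np.full(M, M ** (-1.0 / p))  # uniform start, ||theta||_p = 1
        alpha = np.zeros(n)
        for _ in range(n_iter):
            # (a) kernel ridge regression with the weighted kernel sum
            K = sum(t * Km for t, Km in zip(theta, kernels))
            alpha = np.linalg.solve(K + lam * np.eye(n), y)
            # (b) block norms: ||f_m||_{H_m} = theta_m * sqrt(alpha' K_m alpha)
            norms = np.array([t * np.sqrt(max(float(alpha @ Km @ alpha), 0.0))
                              for t, Km in zip(theta, kernels)])
            # closed-form update keeping theta on the unit lp-norm ball
            theta = norms ** (2.0 / (p + 1.0)) / (
                np.sum(norms ** (2.0 * p / (p + 1.0))) ** (1.0 / p) + 1e-12)
        return theta, alpha

    # Toy usage with three RBF bandwidths (all data hypothetical):
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
    Ks = [np.exp(-((X[:, None] - X[None]) ** 2).sum(-1) / (2 * s ** 2))
          for s in (0.5, 1.0, 2.0)]
    theta, alpha = lp_mkl_krr(Ks, y, p=1.5)

In this sketch, p close to 1 tends to drive the learned kernel weights toward sparsity, while larger p spreads weight across kernels; as p grows, the update approaches a uniform kernel combination.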


research ∙ 03/02/2011
Fast Convergence Rate of Multiple Kernel Learning with Elastic-net Regularization
We investigate the learning rate of multiple kernel learning (MKL) with e...

research ∙ 11/16/2011
Fast Learning Rate of Non-Sparse Multiple Kernel Learning and Optimal Regularization Strategies
In this paper, we give a new generalization error bound of Multiple Kern...

research ∙ 03/03/2011
The Local Rademacher Complexity of Lp-Norm Multiple Kernel Learning
We derive an upper bound on the local Rademacher complexity of ℓ_p-norm ...

research ∙ 04/15/2018
Adaptivity for Regularized Kernel Methods by Lepskii's Principle
We address the problem of adaptivity in the framework of reproducing ke...

research ∙ 05/21/2020
Extrapolating the profile of a finite population
We study a prototypical problem in empirical Bayes. Namely, consider a p...

research ∙ 01/23/2018
Generalized two-dimensional linear discriminant analysis with regularization
Recent advances show that two-dimensional linear discriminant analysis (...

research ∙ 12/02/2019
Relating lp regularization and reweighted l1 regularization
We propose a general framework of iteratively reweighted l1 methods for ...
