Asymptotic Confidence Sets for General Nonparametric Regression and Classification by Regularized Kernel Methods

03/20/2012
by   Robert Hable, et al.

Regularized kernel methods such as support vector machines and least-squares support vector regression constitute an important class of standard learning algorithms in machine learning. Theoretical investigations of their asymptotic properties have in recent years mainly focused on rates of convergence, while only very few and limited (asymptotic) results on statistical inference are available so far. As this is a serious limitation for their use in mathematical statistics, the goal of this article is to fill that gap. Based on the asymptotic normality of many of these methods, the article derives a strongly consistent estimator for the unknown covariance matrix of the limiting normal distribution. In this way, we obtain asymptotically correct confidence sets for ψ(f_P,λ_0), where f_P,λ_0 denotes the minimizer of the regularized risk in the reproducing kernel Hilbert space H and ψ:H→R^m is any Hadamard-differentiable functional. Applications include (multivariate) pointwise confidence sets for values of f_P,λ_0 and confidence sets for gradients, integrals, and norms.
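To illustrate the kind of pointwise confidence set the abstract describes, the following is a minimal numpy sketch for kernel ridge regression (the least-squares regularized kernel method). It forms the point estimate f_hat(x) = k(x)ᵀ(K + nλI)⁻¹y and a normal-approximation interval using a simple plug-in residual-variance estimate. The plug-in variance is an illustrative simplification and is NOT the paper's strongly consistent covariance estimator; the kernel, λ, and the confidence level are arbitrary choices for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=0.1, gamma=1.0):
    # Minimizer of the regularized empirical least-squares risk in the RKHS:
    # alpha solves (K + n*lam*I) alpha = y
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_predict_ci(X, y, alpha, X_new, lam=0.1, gamma=1.0):
    # Pointwise 95% normal-approximation intervals for f(x) at the rows of
    # X_new, using a naive plug-in residual variance (illustrative only).
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    k_new = rbf_kernel(X_new, X, gamma)                  # (m, n)
    f_hat = k_new @ alpha                                # point predictions
    A = k_new @ np.linalg.inv(K + n * lam * np.eye(n))   # linear-smoother rows
    resid = y - K @ alpha
    sigma2 = resid @ resid / n                           # residual variance
    se = np.sqrt(sigma2 * (A ** 2).sum(axis=1))          # pointwise std. errors
    z = 1.96                                             # 97.5% normal quantile
    return f_hat, f_hat - z * se, f_hat + z * se
```

Usage: fit on noisy data, then request intervals on a grid, e.g. `f, lo, hi = krr_predict_ci(X, y, krr_fit(X, y), X_grid)`; each interval [lo_i, hi_i] brackets the corresponding point estimate f_i.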
