Optimal γ and C for ε-Support Vector Regression with RBF Kernels

06/12/2015
by   Longfei Lu, et al.

The objective of this study is to investigate the efficient determination of C and γ for ε-Support Vector Regression with RBF or Mahalanobis kernels, based on numerical and statistical considerations. Our analysis indicates the connection between C and the kernel, and demonstrates that the deviation of geometric distances between neighbouring observations in the mapped feature space affects the prediction accuracy of ε-SVR. We determine the ranges of γ and C and propose a method for choosing their best values.
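The selection problem the abstract describes can be illustrated with a plain cross-validated baseline. The sketch below (scikit-learn, on a synthetic toy dataset) grid-searches C and γ for an RBF-kernel ε-SVR and also computes the feature-space distance ||φ(x) − φ(x')||² = 2 − 2K(x, x'), the geometric quantity the study relates to prediction accuracy. The candidate ranges and the sine toy data are assumptions for illustration; this is not the paper's proposed selection method.

```python
# Baseline sketch: cross-validated search over C and gamma for
# epsilon-SVR with an RBF kernel. The dataset and candidate grids
# are hypothetical; the paper proposes a more direct way to narrow
# these ranges from the data.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))                 # toy inputs
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)    # noisy target

# Distance between observations in the RBF-mapped space:
# ||phi(x) - phi(x')||^2 = 2 - 2 * K(x, x'), since K(x, x) = 1.
D2 = 2.0 - 2.0 * rbf_kernel(X, X, gamma=0.1)

# Hand-picked candidate ranges for C and gamma (illustrative only).
param_grid = {"C": [0.1, 1.0, 10.0, 100.0],
              "gamma": [0.01, 0.1, 1.0, 10.0]}

search = GridSearchCV(SVR(kernel="rbf", epsilon=0.1), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)
```

A full grid search scales multiplicatively in the number of candidate values per parameter, which is why methods that bound the useful range of γ and C up front, as the study sets out to do, are attractive in practice.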

