Support Vector Regression: Risk Quadrangle Framework

12/18/2022
by Anton Malandii, et al.

This paper investigates Support Vector Regression (SVR) within the fundamental risk quadrangle paradigm. It is shown that both formulations of SVR, ε-SVR and ν-SVR, correspond to the minimization of equivalent regular error measures (the Vapnik error and the superquantile (CVaR) norm, respectively) with a regularization penalty. These error measures, in turn, give rise to corresponding risk quadrangles. By constructing the fundamental risk quadrangle corresponding to SVR, we show that SVR is an asymptotically unbiased estimator of the average of two symmetric conditional quantiles. Furthermore, the technique used to construct the quadrangles serves as a powerful tool for proving the equivalence between ε-SVR and ν-SVR. Additionally, by invoking the Error Shaping Decomposition of Regression, SVR is formulated as a regular deviation minimization problem with a regularization penalty, and the dual formulation of SVR in the risk quadrangle framework is derived.
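The two error measures named in the abstract can be sketched numerically. The Vapnik error is the expected ε-insensitive loss E[max(|X| − ε, 0)], and the superquantile (CVaR) norm at confidence level α is the CVaR of |X|, i.e. the mean of the upper (1 − α) tail of |X|. The following minimal NumPy sketch is an illustration only, not code from the paper; the function names and the parameter values ε = 0.5 and α = 0.9 are assumptions for the example:

```python
import numpy as np

def vapnik_error(x, eps):
    """Empirical Vapnik error: mean epsilon-insensitive magnitude,
    E[max(|X| - eps, 0)] estimated over the sample x."""
    return np.mean(np.maximum(np.abs(x) - eps, 0.0))

def cvar_norm(x, alpha):
    """Empirical superquantile (CVaR) norm: average of the largest
    (1 - alpha) fraction of |X| (a simple sample-based estimate)."""
    a = np.sort(np.abs(x))
    k = int(np.ceil(alpha * len(a)))  # index of the alpha-quantile
    return a[k:].mean() if k < len(a) else a[-1]

# Example: evaluate both error measures on synthetic residuals.
rng = np.random.default_rng(0)
residuals = rng.normal(size=10_000)
print(vapnik_error(residuals, eps=0.5))
print(cvar_norm(residuals, alpha=0.9))
```

Per the abstract, minimizing each of these regular error measures (plus a regularization penalty) recovers ε-SVR and ν-SVR, respectively.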
