Generalized XGBoost Method

09/15/2021
by Yang Guang, et al.

The XGBoost method has many advantages and is especially suitable for statistical analysis of big data, but its loss function is limited to convex functions. In many specific applications, a non-convex loss function would be preferable. In this paper, we propose a generalized XGBoost method, which requires a weaker condition on the loss function and accommodates more general loss functions, including convex loss functions and some non-convex loss functions. Furthermore, this generalized XGBoost method is extended to multivariate loss functions, yielding a still more general XGBoost method. The result is a multivariate regularized tree boosting method that can model multiple parameters of most frequently used parametric probability distributions as functions of the predictor variables. The related algorithms and some examples in non-life insurance pricing are also given.
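For context, the standard xgboost library already exposes a custom-objective interface that takes first- and second-order derivatives of the loss. The sketch below is not the paper's generalized method; it only illustrates, under assumed toy data, why non-convex losses are awkward in the standard framework: a Cauchy-type loss log(1 + r^2) has a Hessian that turns negative for large residuals, so here it is simply clipped, a pragmatic workaround rather than the weaker condition the paper proposes.

```python
import numpy as np
import xgboost as xgb

def cauchy_objective(preds, dtrain):
    """Custom objective for the non-convex loss L(y, f) = log(1 + (f - y)^2)."""
    y = dtrain.get_label()
    r = preds - y
    grad = 2.0 * r / (1.0 + r ** 2)
    hess = 2.0 * (1.0 - r ** 2) / (1.0 + r ** 2) ** 2
    # The true Hessian is negative when |r| > 1, which breaks the usual
    # second-order leaf-weight formula; clipping is a crude fix, not the
    # paper's approach.
    hess = np.maximum(hess, 1e-6)
    return grad, hess

# Toy data, assumed purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + 0.1 * rng.standard_cauchy(size=500)
dtrain = xgb.DMatrix(X, label=y)

booster = xgb.train(
    {"max_depth": 3, "eta": 0.1},
    dtrain,
    num_boost_round=50,
    obj=cauchy_objective,
)
```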
