MixBoost: A Heterogeneous Boosting Machine

06/17/2020
by Thomas Parnell, et al.

Modern gradient boosting software frameworks, such as XGBoost and LightGBM, implement Newton descent in a functional space. At each boosting iteration, their goal is to find the base hypothesis, selected from some base hypothesis class, that is closest to the Newton descent direction in a Euclidean sense. Typically, the base hypothesis class is fixed to be all binary decision trees up to a given depth. In this work, we study a Heterogeneous Newton Boosting Machine (HNBM) in which the base hypothesis class may vary across boosting iterations. Specifically, at each boosting iteration, the base hypothesis class is chosen from a fixed set of subclasses by sampling from a probability distribution. We derive a global linear convergence rate for the HNBM under certain assumptions, and show that it agrees with existing rates for Newton's method when the Newton direction can be perfectly fitted by the base hypothesis at each boosting iteration. We then describe a particular realization of an HNBM, MixBoost, that at each boosting iteration randomly selects between either a decision tree of variable depth or a linear regressor with random Fourier features. We describe how MixBoost is implemented, with a focus on training complexity. Finally, we present experimental results, using OpenML and Kaggle datasets, showing that MixBoost achieves lower generalization loss than competing boosting frameworks, without taking significantly longer to tune.
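To make the heterogeneous sampling scheme concrete, below is a minimal sketch (not the authors' implementation) of such a Newton boosting loop for binary logistic loss, using scikit-learn base learners. The hyper-parameters here (p_tree, the tree-depth range, the number of Fourier features) are illustrative assumptions, not MixBoost's actual API.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import Ridge
from sklearn.kernel_approximation import RBFSampler


def fit_hnbm(X, y, n_rounds=100, lr=0.1, p_tree=0.7, seed=0):
    """Heterogeneous Newton boosting sketch for binary logistic loss.

    Each round samples a base-hypothesis subclass: a depth-limited
    tree with probability p_tree, otherwise ridge regression on
    random Fourier features (an RBF kernel approximation).
    """
    rng = np.random.default_rng(seed)
    F = np.zeros(len(y))                        # raw scores (log-odds)
    predictors = []                             # callables: X -> scores
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-F))
        g = p - y                               # gradient of logistic loss
        h = np.clip(p * (1.0 - p), 1e-6, None)  # hessian of logistic loss
        target = -g / h                         # Newton descent direction
        if rng.random() < p_tree:
            # variable-depth tree, fitted by hessian-weighted least squares,
            # i.e. minimizing sum_i h_i * (f(x_i) + g_i / h_i)^2
            tree = DecisionTreeRegressor(max_depth=int(rng.integers(1, 8)))
            tree.fit(X, target, sample_weight=h)
            predictors.append(tree.predict)
        else:
            # linear regressor on random Fourier features
            rff = RBFSampler(n_components=128, random_state=seed)
            lin = Ridge(alpha=1.0)
            lin.fit(rff.fit_transform(X), target, sample_weight=h)
            predictors.append(lambda Z, r=rff, m=lin: m.predict(r.transform(Z)))
        F += lr * predictors[-1](X)
    return predictors


def predict_proba(predictors, X, lr=0.1):
    """Aggregate the ensemble's raw scores into probabilities."""
    F = lr * sum(f(X) for f in predictors)
    return 1.0 / (1.0 + np.exp(-F))
```

Note that setting p_tree = 1 in this sketch recovers the usual tree-only boosting machine as a special case; the heterogeneity comes entirely from the per-iteration sampling of the subclass.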

