Asynch-SGBDT: Asynchronous Parallel Stochastic Gradient Boosting Decision Tree based on Parameters Server

04/12/2018
by   Cheng Daning, et al.

Gradient Boosting Decision Tree (GBDT) has become one of the most important machine learning algorithms. However, training a GBDT model requires substantial computational resources and time, even with fork-join parallelism and sampling techniques. To accelerate GBDT training, this paper proposes asynchronous parallel stochastic gradient boosting decision tree (asynch-SGBDT). By changing the view of sampling, we recast the numerical optimization in traditional GBDT training as a stochastic optimization process and use asynchronous parallel stochastic gradient descent to accelerate training. Asynch-SGBDT is also well suited to the Parameters Server architecture. In addition, we provide a theoretical analysis of asynch-SGBDT. Experimental results show that asynch-SGBDT accelerates GBDT training, and our asynchronous parallel strategy achieves an almost linear speedup, especially on high-dimensional sparse datasets.
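The following is a minimal sketch (not the authors' implementation) of the asynch-SGBDT idea described above: several workers each draw a random subsample, fit a small regression tree to the current negative gradient (residuals under squared loss), and push the tree to a shared ensemble without waiting for one another. The shared list of trees stands in for the parameter server; names such as `worker`, `shared_trees`, and the hyperparameter values are illustrative assumptions, not taken from the paper.

```python
import threading
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.datasets import make_regression

X, y = make_regression(n_samples=2000, n_features=50, random_state=0)

shared_trees = []        # global model: list of (learning_rate, tree) pairs
lock = threading.Lock()  # protects only the push, not the (expensive) tree fitting


def predict(X_batch):
    """Ensemble prediction from whatever trees have been pushed so far."""
    pred = np.zeros(X_batch.shape[0])
    # snapshot the list; workers may append concurrently, stale reads are allowed
    for lr, tree in list(shared_trees):
        pred += lr * tree.predict(X_batch)
    return pred


def worker(n_rounds, sample_frac=0.3, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(n_rounds):
        # stochastic step: each round uses a fresh random subsample
        idx = rng.choice(len(X), size=int(sample_frac * len(X)), replace=False)
        # negative gradient of squared loss = residuals on the (possibly stale) model
        residuals = y[idx] - predict(X[idx])
        tree = DecisionTreeRegressor(max_depth=3)
        tree.fit(X[idx], residuals)
        with lock:  # cheap push; fitting happened outside the lock, asynchronously
            shared_trees.append((lr, tree))


threads = [threading.Thread(target=worker, kwargs={"n_rounds": 20, "seed": s})
           for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("trees in ensemble:", len(shared_trees))
print("training MSE:", np.mean((y - predict(X)) ** 2))
```

Because each worker computes residuals against whatever trees have been pushed so far, gradients may be slightly stale; the paper's theoretical analysis concerns how such delayed updates affect convergence, which this toy sketch does not attempt to reproduce.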
