Communication-Efficient Weighted Sampling and Quantile Summary for GBDT

09/17/2019
by Ziyue Huang, et al.

Gradient boosting decision tree (GBDT) is a powerful and widely used machine learning model that has achieved state-of-the-art performance in many academic benchmarks and production environments. However, training on today's massive datasets requires distributed computation, and communication overhead is its main bottleneck. In this paper, we propose two novel communication-efficient methods over distributed datasets to mitigate this problem: a weighted sampling approach by which we can efficiently estimate the information gain over a small subset, and distributed protocols for the weighted quantile problem used in approximate tree learning.
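As a rough illustration of the first idea, the sketch below draws instances with probability proportional to their gradient magnitude and reweights the sampled gradient/Hessian sums by 1/(n·p_i), so an XGBoost-style split gain G²/(H+λ) can be estimated without touching the full dataset. The proportional-to-|gradient| distribution and all names here are assumptions for illustration; the paper's exact sampling scheme may differ.

```python
import numpy as np

def estimate_split_gain(grad, hess, left_mask, n_samples, lam=1.0, rng=None):
    """Hypothetical sketch: estimate an XGBoost-style split gain from a
    weighted subsample. Indices are drawn with p_i proportional to |g_i|
    (an assumed weighting) and reweighted by 1 / (n_samples * p_i) so the
    gradient/Hessian sums remain unbiased estimates of the full sums."""
    rng = rng or np.random.default_rng(0)
    p = np.abs(grad) / np.abs(grad).sum()             # sampling distribution
    idx = rng.choice(len(grad), size=n_samples, p=p)  # weighted sample, with replacement
    w = 1.0 / (n_samples * p[idx])                    # importance weights

    g, h = np.sum(w * grad[idx]), np.sum(w * hess[idx])  # estimated total sums
    gl = np.sum(w * grad[idx] * left_mask[idx])          # estimated left-child sums
    hl = np.sum(w * hess[idx] * left_mask[idx])

    gain = lambda G, H: G * G / (H + lam)             # leaf score G^2 / (H + lambda)
    return gain(gl, hl) + gain(g - gl, h - hl) - gain(g, h)
```

Because each draw contributes g_j / (n_samples · p_j) with expectation (1/n_samples)·Σ g, the reweighted sums are unbiased, which is what makes gain estimation over a small subset viable.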
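For the second idea, here is a minimal single-round sketch of a distributed weighted quantile summary: each worker compresses its (feature value, weight) pairs into k cut points carrying the weight mass between them, and a coordinator merges and re-compresses the summaries into split candidates. This is a generic merge-and-prune scheme for illustration, not the paper's protocol, which provides sharper communication and error guarantees.

```python
import numpy as np

def local_summary(values, weights, k):
    """Worker side: compress weighted points into ~k quantile cut points."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    targets = np.linspace(0.0, cum[-1], k)             # evenly spaced weighted ranks
    idx = np.searchsorted(cum, targets).clip(0, len(v) - 1)
    mass = np.diff(np.concatenate(([0.0], cum[idx])))  # weight between chosen cuts
    return v[idx], mass

def merge_summaries(summaries, k):
    """Coordinator side: pool local summaries and re-compress to k cuts."""
    v = np.concatenate([s[0] for s in summaries])
    w = np.concatenate([s[1] for s in summaries])
    return local_summary(v, w, k)

# Example: three workers, 200 weighted points each, merged into 16 candidates.
rng = np.random.default_rng(0)
parts = [local_summary(rng.normal(size=200), rng.random(200), 64) for _ in range(3)]
cuts, mass = merge_summaries(parts, 16)
```

Each worker sends only O(k) pairs regardless of its data size, which is the communication pattern the abstract's quantile protocols aim to optimize.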
