Communication Efficient Sparsification for Large Scale Machine Learning

by Sarit Khirirat, et al.

The increasing scale of distributed learning problems necessitates the development of compression techniques for reducing the information exchange between compute nodes. The level of accuracy in existing compression techniques is typically chosen before training, meaning that they are unlikely to adapt well to the problems that they are solving without extensive hyper-parameter tuning. In this paper, we propose dynamic tuning rules that adapt to the communicated gradients at each iteration. In particular, our rules optimize the communication efficiency at each iteration by maximizing the improvement in the objective function that is achieved per communicated bit. Our theoretical results and experiments indicate that the automatic tuning strategies significantly increase communication efficiency on several state-of-the-art compression schemes.
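The per-bit rule described above can be sketched for top-K sparsification: for an L-smooth objective, one gradient step with the sparsified gradient decreases the objective by an amount proportional to the squared norm of the kept entries, so a natural heuristic is to pick the K that maximizes that squared norm divided by the bits needed to transmit K value/index pairs. The sketch below is an illustrative reading of this idea, not the paper's exact algorithm; `bits_per_entry` and `index_bits` are assumed encoding costs.

```python
import numpy as np

def adaptive_topk(grad, bits_per_entry=32, index_bits=32):
    """Pick the sparsification level K that maximizes objective
    improvement per communicated bit (illustrative sketch).

    Improvement proxy for top-K: ||g_K||^2, the squared norm of the
    K largest-magnitude entries. Cost model: each kept entry needs
    one value plus one index (assumed bit widths).
    """
    mags = np.sort(np.abs(grad))[::-1]          # magnitudes, descending
    improvement = np.cumsum(mags ** 2)          # ||g_K||^2 for K = 1..d
    ks = np.arange(1, grad.size + 1)
    payload_bits = ks * (bits_per_entry + index_bits)
    per_bit = improvement / payload_bits        # efficiency of each K
    k = int(ks[np.argmax(per_bit)])

    # Build the sparse gradient keeping the k largest-magnitude entries.
    out = np.zeros_like(grad)
    idx = np.argsort(np.abs(grad))[::-1][:k]
    out[idx] = grad[idx]
    return out, k
```

With one dominant coordinate, the rule keeps only that coordinate; as the gradient mass spreads out, the maximizer shifts toward denser messages, which is the adaptive behavior the abstract describes.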




Related research

Distributed learning with compressed gradients

1-bit Adam: Communication Efficient Large-Scale Training with Adam's Convergence Speed

Pufferfish: Communication-efficient Models At No Extra Cost

Hyper-Sphere Quantization: Communication-Efficient SGD for Federated Learning

Private Federated Learning with Autotuned Compression

Trajectory Normalized Gradients for Distributed Optimization

Anytime MiniBatch: Exploiting Stragglers in Online Distributed Optimization
