DEED: A General Quantization Scheme for Communication Efficiency in Bits

06/19/2020
by Tian Ye, et al.

In distributed optimization, a popular technique to reduce communication is quantization. In this paper, we provide a general analysis framework for inexact gradient descent that is applicable to quantization schemes. We also propose a quantization scheme, Double Encoding and Error Diminishing (DEED). DEED achieves low communication complexity in three settings: frequent-communication large-memory, frequent-communication small-memory, and infrequent-communication (e.g. federated learning). More specifically, in the frequent-communication large-memory setting, DEED can be easily combined with Nesterov's method, so that the total number of bits required is Õ(√κ log(1/ϵ)), where Õ hides numerical constants and log κ factors. In the frequent-communication small-memory setting, DEED combined with SGD requires only Õ(κ log(1/ϵ)) bits in the interpolation regime. In the infrequent-communication setting, DEED combined with Federated Averaging requires fewer total bits than Federated Averaging alone. All these algorithms converge at the same rate as their non-quantized versions, while using fewer bits.
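The "error diminishing" idea is closely related to error-feedback (memory) quantization: the residual left over after quantizing a gradient is stored locally and added back to the next gradient before quantizing, so quantization errors do not accumulate across rounds. The sketch below illustrates that generic pattern in Python with an unbiased stochastic uniform quantizer. It is only an illustration of the general inexact-gradient technique under my own assumptions (the function names, the 16-level default, and the error-feedback variant shown are not the paper's exact DEED encoding):

    import numpy as np

    _rng = np.random.default_rng(0)

    def quantize(v, num_levels=16):
        """Stochastic uniform quantizer: maps each entry of v onto one of
        `num_levels` levels spanning [-max|v|, max|v|], rounding up or down
        at random so the result is unbiased (E[quantize(v)] = v)."""
        scale = np.max(np.abs(v))
        if scale == 0.0:
            return v.copy()
        # Normalize entries to [0, num_levels - 1], then round stochastically.
        x = (v / scale + 1.0) / 2.0 * (num_levels - 1)
        lower = np.floor(x)
        x_q = lower + (_rng.random(v.shape) < (x - lower))
        # Map the integer levels back to the original range.
        return (x_q / (num_levels - 1) * 2.0 - 1.0) * scale

    def quantized_gd_step(x, grad_fn, memory, lr=0.1):
        """One inexact gradient step with error feedback: the quantization
        residual is kept in `memory` and re-added on the next round, so the
        accumulated error stays bounded instead of growing."""
        g = grad_fn(x) + memory   # add back last round's leftover error
        g_q = quantize(g)         # low-bit message that gets communicated
        memory = g - g_q          # residual carried over to the next step
        return x - lr * g_q, memory

A minimal usage example, minimizing f(x) = ½‖x‖², whose gradient is x:

    x, mem = np.ones(4), np.zeros(4)
    for _ in range(100):
        x, mem = quantized_gd_step(x, lambda z: z, mem)

Because the quantizer is unbiased and the residual is fed back, the iterates track plain gradient descent while each communicated gradient uses only log2(num_levels) bits per coordinate.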
