Trading Communication for Computation in Byzantine-Resilient Gradient Coding

03/23/2023
by Christoph Hofmeister, et al.

We consider gradient coding in the presence of an adversary controlling so-called malicious workers that try to corrupt the computations. Previous works propose the use of MDS codes to treat the inputs of the malicious workers as errors and correct them using the error-correction properties of the code. This comes at the expense of increasing the replication, i.e., the number of workers each partial gradient is computed by. In this work, we reduce the replication by proposing a method that detects the erroneous inputs from the malicious workers, thereby transforming them into erasures. For s malicious workers, our solution can reduce the replication to s+1 instead of 2s+1 for each partial gradient, at the expense of only s additional computations at the main node and additional rounds of light communication between the main node and the workers. We derive fundamental limits of the general framework for fractional repetition data allocation. Our scheme is optimal in terms of replication and local computation, but incurs a communication cost that is asymptotically, in the size of the dataset, a multiplicative factor away from the derived bound.
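To make the detection-versus-correction trade-off concrete, the following toy sketch illustrates the general idea in Python under strong simplifying assumptions: a fractional repetition allocation where each data partition is replicated at s+1 workers, a least-squares partial gradient, and a tie-breaking rule in which the main node recomputes a disputed partition itself. The function names (`partial_gradient`, `aggregate`) and the exact recovery rule are illustrative assumptions, not the authors' protocol.

```python
# Toy sketch (not the paper's exact scheme): replication s + 1 per
# partition; disagreements among replicas are detected and resolved
# by at most one local recomputation per disputed partition, i.e.,
# at most s extra computations when at most s workers are malicious.
import numpy as np


def partial_gradient(X, y, w):
    """Least-squares partial gradient on one data partition (assumed loss)."""
    return X.T @ (X @ w - y)


def aggregate(partitions, reports, w, s):
    """Combine worker reports into the full gradient.

    partitions: list of (X_i, y_i) data partitions.
    reports[i]: list of s + 1 candidate gradients for partition i,
                one per worker in the group assigned to it.
    """
    total = 0.0
    recomputations = 0
    for (X_i, y_i), candidates in zip(partitions, reports):
        reference = candidates[0]
        if all(np.allclose(c, reference) for c in candidates[1:]):
            # All s + 1 replicas agree; at least one replica comes from
            # an honest worker, so the common value is correct.
            total = total + reference
        else:
            # Disagreement reveals a malicious worker in this group.
            # Treat the corrupted reports as erasures and recompute
            # this partial gradient at the main node.
            total = total + partial_gradient(X_i, y_i, w)
            recomputations += 1
    # With one partition per group and at most s malicious workers,
    # at most s partitions can be disputed.
    assert recomputations <= s, "more disputes than malicious workers"
    return total
```

In this simplified setting, replication 2s+1 would instead allow a plain majority vote per partition with no recomputation; the sketch shows how lowering the replication to s+1 shifts that cost to a small number of local recomputations at the main node.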
