Byzantine Fault Tolerance in Distributed Machine Learning: A Survey
Byzantine Fault Tolerance (BFT) is among the most challenging problems in Distributed Machine Learning (DML). Byzantine failures remain difficult to handle because of their unrestricted nature, which allows faulty workers to send arbitrary data. Extensive research efforts are continually being made to achieve BFT in DML. Some recent studies have surveyed BFT approaches in DML; however, they are limited in certain respects, such as the small number of approaches analyzed and the absence of a classification of the techniques those approaches employ. In this paper, we present a survey of recent work on BFT in DML, focusing on first-order optimization methods, especially Stochastic Gradient Descent (SGD). We highlight the key techniques as well as the fundamental approaches, offer an illustrative description of the techniques used for BFT in DML, and propose a classification of BFT approaches according to their underlying techniques. This classification is based on specific criteria such as the communication process, the optimization method, and the topology setting, and it characterizes directions for future work addressing the current challenges of BFT in DML.
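To make the problem concrete, the following is a minimal sketch of one widely studied family of Byzantine-robust SGD techniques: replacing the plain average of worker gradients with a robust aggregation rule. The example uses the coordinate-wise median (one representative rule among many covered by such surveys) and assumes gradients are exchanged as NumPy arrays; the function and variable names are illustrative, not taken from any specific system.

```python
import numpy as np

def coordinate_wise_median(gradients):
    """Aggregate worker gradients by taking the median of each coordinate.

    Unlike plain averaging, the per-coordinate median is resilient when
    a minority of workers send arbitrary (Byzantine) values.
    """
    stacked = np.stack(gradients)      # shape: (n_workers, dim)
    return np.median(stacked, axis=0)  # shape: (dim,)

# Three honest workers report similar gradients; one Byzantine worker
# reports an arbitrary adversarial vector.
honest = [np.array([1.0, 2.0]), np.array([1.1, 1.9]), np.array([0.9, 2.1])]
byzantine = [np.array([1e6, -1e6])]

agg = coordinate_wise_median(honest + byzantine)
# The aggregate stays close to the honest gradients despite the outlier.
```

A plain mean over the same four vectors would be dominated by the Byzantine contribution, which is precisely the failure mode that robust aggregation rules are designed to avoid.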