Yes, Topology Matters in Decentralized Optimization: Refined Convergence and Topology Learning under Heterogeneous Data
One of the key challenges in federated and decentralized learning is to design algorithms that efficiently deal with highly heterogeneous data distributions across agents. In this paper, we revisit the analysis of the Decentralized Stochastic Gradient Descent (D-SGD) algorithm, a popular decentralized learning algorithm, under data heterogeneity. We exhibit the key role played by a new quantity, which we call neighborhood heterogeneity, in the convergence rate of D-SGD. Unlike prior work, neighborhood heterogeneity is measured at the level of an agent's neighborhood in the graph topology. By coupling the topology and the heterogeneity of the agents' distributions, our analysis sheds light on the poorly understood interplay between these two concepts in decentralized learning. We then argue that neighborhood heterogeneity provides a natural criterion for learning sparse data-dependent topologies that reduce (and can even eliminate) the otherwise detrimental effect of data heterogeneity on the convergence time of D-SGD. For the important case of classification with label skew, we formulate the problem of learning such a good topology as a tractable optimization problem that we solve with a Frank-Wolfe algorithm. Our approach provides a principled way to design a sparse topology that balances the number of iterations and the per-iteration communication costs of D-SGD under data heterogeneity.
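To fix ideas, the following is a minimal sketch of a generic synchronous D-SGD round, not the paper's exact pseudocode: each agent takes a local stochastic gradient step and then averages its parameters with its neighbors through a mixing matrix W whose sparsity pattern encodes the communication topology. The function name, signature, and the assumption of a doubly stochastic W are illustrative.

```python
import numpy as np

def dsgd_step(params, grads, W, lr):
    """One synchronous D-SGD round (illustrative sketch).

    params: (n_agents, dim) array of local model parameters
    grads:  (n_agents, dim) array of local stochastic gradients
    W:      (n_agents, n_agents) doubly stochastic mixing matrix;
            W[i, j] > 0 only if agents i and j are neighbors
    lr:     learning rate
    """
    # Local stochastic gradient step at every agent.
    local = params - lr * grads
    # Gossip averaging: each agent mixes with its neighbors' parameters.
    return W @ local
```

Under this view, the choice of W (the topology) directly determines how quickly local heterogeneity is averaged out across neighborhoods, which is the quantity the paper's analysis and topology-learning procedure target.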