Communication-Efficient Federated Bilevel Optimization with Local and Global Lower Level Problems
Bilevel Optimization has witnessed notable progress recently with newly emerging efficient algorithms, yet it remains underexplored in the Federated Learning setting, and it is unclear how the challenges of Federated Learning affect the convergence of bilevel algorithms. In this work, we study Federated Bilevel Optimization problems. We first propose the FedBiO algorithm, which solves the hyper-gradient estimation problem efficiently; we then propose FedBiOAcc to accelerate FedBiO. FedBiO has a communication complexity of O(ϵ^-1.5) with linear speedup, while FedBiOAcc achieves a communication complexity of O(ϵ^-1), a sample complexity of O(ϵ^-1.5), and also linear speedup. We also study Federated Bilevel Optimization problems with local lower level problems, and prove that FedBiO and FedBiOAcc converge at the same rates with minor modifications.
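As a rough illustration of the setting the abstract refers to (a sketch under assumed notation, not taken from the paper itself): with M clients holding local upper- and lower-level objectives f_m and g_m, the global-lower-level federated bilevel problem is commonly written as

$$\min_{x}\; f(x) = \frac{1}{M}\sum_{m=1}^{M} f_m\bigl(x,\, y^*(x)\bigr) \quad \text{s.t.} \quad y^*(x) = \arg\min_{y}\; \frac{1}{M}\sum_{m=1}^{M} g_m(x, y),$$

and, when the lower-level objective g = (1/M) Σ_m g_m is strongly convex in y, the hyper-gradient takes the standard implicit-function form

$$\nabla f(x) = \nabla_x f\bigl(x, y^*(x)\bigr) - \nabla^2_{xy} g\bigl(x, y^*(x)\bigr)\,\bigl[\nabla^2_{yy} g\bigl(x, y^*(x)\bigr)\bigr]^{-1} \nabla_y f\bigl(x, y^*(x)\bigr).$$

This is the quantity whose distributed estimation is the focus of the hyper-gradient estimation problem mentioned above; the symbols f_m, g_m, and y^*(x) are assumed notation for illustration only.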