A Simple and Efficient Stochastic Algorithm for Decentralized Nonconvex-Strongly-Concave Minimax Optimization
This paper studies stochastic optimization for the decentralized nonconvex-strongly-concave minimax problem. We propose a simple and efficient algorithm, called the Decentralized Recursive gradient descEnt Ascent Method (DREAM), which requires 𝒪(κ^3 ϵ^-3) stochastic first-order oracle (SFO) calls and 𝒪(κ^2 ϵ^-2 / √(1-λ_2(W))) communication rounds to find an ϵ-stationary point, where κ is the condition number and λ_2(W) is the second-largest eigenvalue of the gossip matrix W. To the best of our knowledge, DREAM is the first algorithm whose SFO and communication complexities simultaneously achieve the optimal dependency on ϵ and λ_2(W) for this problem.
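To fix ideas, here is a minimal sketch of the general setting the abstract describes: n agents each hold local primal/dual iterates, take a stochastic gradient descent ascent step, and average with neighbors through a doubly stochastic gossip matrix W. The abstract does not specify DREAM's actual update rule (in particular its recursive, variance-reduced gradient estimator or step sizes), so the plain SGDA update and the helper names `gossip_matrix` and `decentralized_sgda` below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def gossip_matrix(n):
    # Ring topology with uniform weights: symmetric and doubly stochastic
    # (assumes n >= 3 so neighbor indices do not collide).
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1.0 / 3.0
        W[i, (i - 1) % n] = 1.0 / 3.0
        W[i, (i + 1) % n] = 1.0 / 3.0
    return W

def decentralized_sgda(grad_x, grad_y, X, Y, W, eta_x, eta_y, rounds, rng):
    """Generic decentralized stochastic gradient descent ascent (not DREAM).

    X, Y: (n, d) arrays of local iterates, one row per agent.
    grad_x, grad_y: stochastic gradient oracles taking (x_i, y_i, rng).
    """
    n = X.shape[0]
    for _ in range(rounds):
        Gx = np.stack([grad_x(X[i], Y[i], rng) for i in range(n)])
        Gy = np.stack([grad_y(X[i], Y[i], rng) for i in range(n)])
        # Local descent on x, ascent on y, then one gossip round each;
        # mixing speed is governed by the spectral gap 1 - λ_2(W).
        X = W @ (X - eta_x * Gx)
        Y = W @ (Y + eta_y * Gy)
    return X, Y

# Toy usage: f(x, y) = 0.5||x||^2 + x·y - 0.5||y||^2, a nonconvex-strongly-concave
# template's strongly-concave-in-y special case, with Gaussian gradient noise.
rng = np.random.default_rng(0)
gx = lambda x, y, rng: x + y + 0.01 * rng.standard_normal(x.shape)
gy = lambda x, y, rng: x - y + 0.01 * rng.standard_normal(y.shape)
n, d = 8, 5
X0, Y0 = rng.standard_normal((n, d)), rng.standard_normal((n, d))
X, Y = decentralized_sgda(gx, gy, X0, Y0, gossip_matrix(n), 0.1, 0.1, 500, rng)
```

DREAM's improvement over this plain scheme lies in replacing the raw stochastic gradients with a recursive estimator, which is what yields the 𝒪(κ^3 ϵ^-3) SFO complexity stated above; the sketch only illustrates the decentralized minimax structure and the role of W.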