Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis

04/15/2023
by Dachao Lin, et al.

We study finite-sum distributed optimization problems with n clients under the popular δ-similarity and μ-strong convexity conditions. We propose two new algorithms, SVRS and AccSVRS, motivated by previous works. The non-accelerated SVRS method combines the techniques of gradient sliding and variance reduction, and achieves a better communication complexity of Õ(n + √n·δ/μ) than existing non-accelerated algorithms. Applying the framework proposed in Katyusha X, we also build a directly accelerated, practical version named AccSVRS with a completely smoothness-free Õ(n + n^(3/4)·√(δ/μ)) communication complexity that improves upon existing algorithms in ill-conditioned cases. Furthermore, we show a nearly matching lower bound to verify the tightness of our AccSVRS method.
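
For context, a minimal formal sketch of this setting (the notation below is a common formalization of average second-order similarity and is assumed here, not quoted from the paper): the n clients jointly minimize a finite-sum objective, and the average δ-similarity condition bounds how much the variation of each client's gradient can deviate, on average, from that of the global objective:

\min_{x \in \mathbb{R}^d} f(x) := \frac{1}{n} \sum_{i=1}^{n} f_i(x),
\qquad
\frac{1}{n} \sum_{i=1}^{n} \bigl\| \nabla f_i(x) - \nabla f_i(y) - \bigl( \nabla f(x) - \nabla f(y) \bigr) \bigr\|^2 \le \delta^2 \, \| x - y \|^2 \quad \forall\, x, y,

where each f_i is held by client i and f is μ-strongly convex. Under such assumptions, the ratio δ/μ plays the role of the condition number appearing in the communication complexities above.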
