Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks

06/08/2021
by Dmitry Kovalev, et al.

We consider the task of minimizing the sum of smooth and strongly convex functions stored in a decentralized manner across the nodes of a communication network whose links are allowed to change in time. We solve two fundamental problems for this task. First, we establish the first lower bounds on the number of decentralized communication rounds and the number of local computations required to find an ϵ-accurate solution. Second, we design two optimal algorithms that attain these lower bounds: (i) a variant of the recently proposed algorithm ADOM (Kovalev et al., 2021) enhanced via a multi-consensus subroutine, which is optimal in the case when access to the dual gradients is assumed, and (ii) a novel algorithm, called ADOM+, which is optimal in the case when access to the primal gradients is assumed. We corroborate the theoretical efficiency of these algorithms by performing an experimental comparison with existing state-of-the-art methods.
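
To make the setting concrete, the task above is the standard decentralized formulation. With n nodes and generic constants L and μ (names chosen here for illustration, not quoted from the abstract), it reads

    \min_{x \in \mathbb{R}^d} f(x) := \frac{1}{n} \sum_{i=1}^{n} f_i(x),

where each f_i is L-smooth and μ-strongly convex and is stored at node i, and nodes may exchange vectors only along the edges of a communication network whose edge set can change between rounds.

The multi-consensus subroutine mentioned above is, in its simplest form, several gossip-averaging rounds run back to back so that nodes agree more closely before the next optimization step. Below is a minimal NumPy sketch of that idea, under the assumption that each round supplies a doubly stochastic mixing matrix supported on the current edges; the function and variable names are illustrative, not the paper's actual subroutine.

```python
import numpy as np

def multi_consensus(X, mixing_matrices):
    """Apply a sequence of gossip-averaging steps to the node states.

    X: (n, d) array; row i holds node i's current vector.
    mixing_matrices: one n-by-n doubly stochastic matrix per round,
        supported on the edges of the (possibly time-varying) network.
    """
    for W in mixing_matrices:
        X = W @ X  # each node averages over its current neighbors
    return X

# Toy usage: a fixed 3-node network, 20 gossip rounds.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
X0 = np.random.randn(3, 2)
XT = multi_consensus(X0, [W] * 20)
# As the number of rounds grows, every row of XT approaches X0.mean(axis=0).
```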

Related research

06/21/2020 · Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization
We consider the task of decentralized minimization of the sum of smooth ...

02/28/2017 · Optimal algorithms for smooth and strongly convex distributed optimization in networks
In this paper, we determine the optimal convergence rates for strongly c...

09/03/2018 · A Dual Approach for Optimal Algorithms in Distributed Optimization over Networks
We study the optimal convergence rates for distributed convex optimizati...

06/11/2020 · IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method
We introduce a framework for designing primal methods under the decentra...

07/26/2022 · DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization for Time-Varying Gossips
DADAO is a novel decentralized asynchronous stochastic algorithm to mini...

02/18/2021 · ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks
We propose ADOM - an accelerated method for smooth and strongly convex d...

08/12/2018 · Parallelization does not Accelerate Convex Optimization: Adaptivity Lower Bounds for Non-smooth Convex Minimization
In this paper we study the limitations of parallelization in convex opti...
