Hierarchical Online Convex Optimization

06/25/2021
by Juncheng Wang et al.

We consider online convex optimization (OCO) over a heterogeneous network with communication delay, where multiple workers together with a master execute a sequence of decisions to minimize the accumulation of time-varying global costs. The local data may not be independent or identically distributed, and the global cost functions may not be locally separable. Due to communication delay, neither the master nor the workers have in-time information about the current global cost function. We propose a new algorithm, termed Hierarchical OCO (HiOCO), which takes full advantage of the network's heterogeneity in information timeliness and computation capacity to enable multi-step gradient descent at both the workers and the master. We analyze the impact of the unique hierarchical architecture, multi-slot delay, and gradient estimation error to derive upper bounds on the dynamic regret of HiOCO, which measures the cost gap between HiOCO and an offline globally optimal performance benchmark.
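To make the setup concrete, the following is a minimal, hypothetical sketch of hierarchical online optimization with delayed information: workers run multi-step projected gradient descent on a one-slot-delayed local cost, and a master aggregates their updates into the next global decision. The toy quadratic costs, the one-slot delay, the averaging rule, and all names (`project`, `STEPS`, `targets`, etc.) are illustrative assumptions for this sketch, not the paper's HiOCO algorithm or its analysis.

```python
import numpy as np

# Illustrative sketch only: toy costs, one-slot delay, and simple averaging
# are assumptions of this example, not the paper's method.
rng = np.random.default_rng(0)
DIM, WORKERS, T = 4, 3, 50
ALPHA, STEPS = 0.1, 3      # step size and per-slot local descent steps
RADIUS = 1.0               # feasible set: Euclidean ball of this radius

def project(x, r=RADIUS):
    """Euclidean projection onto the ball of radius r."""
    n = np.linalg.norm(x)
    return x if n <= r else x * (r / n)

# Toy time-varying local costs: f_{t,i}(x) = ||x - b_{t,i}||^2
targets = rng.normal(size=(T, WORKERS, DIM))

x = np.zeros(DIM)          # global decision held by the master
total_cost = 0.0           # accumulated global cost (a regret proxy)
for t in range(T):
    # Workers only see a delayed cost; each runs multi-step projected
    # gradient descent from the current global decision.
    delayed = max(t - 1, 0)
    local_updates = []
    for i in range(WORKERS):
        y = x.copy()
        for _ in range(STEPS):
            grad = 2.0 * (y - targets[delayed, i])  # gradient of f_{delayed,i}
            y = project(y - ALPHA * grad)
        local_updates.append(y)
    # Master aggregates worker updates into the next global decision.
    x = project(np.mean(local_updates, axis=0))
    # Incur the *current* (unseen) global cost at the committed decision.
    total_cost += np.mean(
        [np.sum((x - targets[t, i]) ** 2) for i in range(WORKERS)]
    )

print(round(total_cost, 3))
```

Comparing `total_cost` against the cost of an offline sequence of per-slot minimizers would give the dynamic regret that the paper's bounds control.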

