On the Convergence of the Stochastic Primal-Dual Hybrid Gradient for Convex Optimization

12/02/2020
by   Eric B. Gutierrez, et al.

The Stochastic Primal-Dual Hybrid Gradient (SPDHG) method, proposed by Chambolle et al. (2018), is a practical tool for solving nonsmooth large-scale optimization problems. In this paper we prove its almost sure convergence for convex, but not necessarily strongly convex, functionals. The proof relies on a classical supermartingale result and on rewriting the algorithm as a sequence of random continuous operators in the primal-dual space. We compare our analysis with a similar argument by Alacaoglu et al. and give sufficient conditions for an unproven claim in their proof.
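For readers unfamiliar with the method, below is a minimal sketch of the SPDHG iteration of Chambolle et al. (2018): a primal proximal step, an update of one randomly sampled dual block, and an extrapolated running aggregate of A^T y. The toy problem min_x sum_i 0.5*||A_i x - b_i||^2 + lam*||x||_1, the block splitting, the step sizes, and the data are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n_blocks, d = 4, 50
A = [rng.standard_normal((20, d)) for _ in range(n_blocks)]  # block operators A_i (toy data)
b = [Ai @ rng.standard_normal(d) for Ai in A]                # block data b_i (toy data)
lam = 0.1                                                    # l1 penalty weight

p = 1.0 / n_blocks                                      # uniform sampling probability
sigma = [0.9 / np.linalg.norm(Ai, 2) for Ai in A]       # dual step sizes
tau = 0.9 * p / max(np.linalg.norm(Ai, 2) for Ai in A)  # primal step size

x = np.zeros(d)
y = [np.zeros(Ai.shape[0]) for Ai in A]            # one dual variable per block
z = sum(Ai.T @ yi for Ai, yi in zip(A, y))         # z = sum_i A_i^T y_i
zbar = z.copy()                                    # extrapolated dual aggregate

def prox_g(v, t):
    # prox of t*lam*||.||_1, i.e. soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

for k in range(2000):
    x = prox_g(x - tau * zbar, tau)                # primal proximal step
    i = rng.integers(n_blocks)                     # sample one dual block
    yi_old = y[i]
    # prox of sigma_i * f_i^* for f_i(z) = 0.5*||z - b_i||^2:
    # prox_{s f*}(v) = (v - s*b_i) / (1 + s)
    v = y[i] + sigma[i] * (A[i] @ x)
    y[i] = (v - sigma[i] * b[i]) / (1.0 + sigma[i])
    dz = A[i].T @ (y[i] - yi_old)                  # change in A^T y
    z = z + dz
    zbar = z + (1.0 / p) * dz                      # extrapolation (theta = 1)
```

The step sizes above are chosen so that tau * sigma_i * ||A_i||^2 < p_i, a sufficient step-size condition of the kind used in these convergence analyses; other valid choices exist.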

Related research

07/25/2022 · On the convergence and sampling of randomized primal-dual algorithms and their application to parallel MRI reconstruction
The Stochastic Primal-Dual Hybrid Gradient or SPDHG is an algorithm prop...

07/07/2014 · The Primal-Dual Hybrid Gradient Method for Semiconvex Splittings
This paper deals with the analysis of a recent reformulation of the prim...

11/03/2020 · Robust Algorithms for Online Convex Problems via Primal-Dual
Primal-dual methods in online optimization give several of the state-of-...

05/23/2013 · A Primal Condition for Approachability with Partial Monitoring
In approachability with full monitoring there are two types of condition...

01/23/2019 · A Fully Stochastic Primal-Dual Algorithm
A new stochastic primal-dual algorithm for solving a composite optimizat...

03/06/2023 · Primal and Dual Analysis of Entropic Fictitious Play for Finite-sum Problems
The entropic fictitious play (EFP) is a recently proposed algorithm that...

05/27/2023 · Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization
We consider (stochastic) subgradient methods for strongly convex but pot...
