On the Convergence of the Stochastic Primal-Dual Hybrid Gradient for Convex Optimization

12/02/2020
by Eric B. Gutierrez, et al.

Stochastic Primal-Dual Hybrid Gradient (SPDHG) was proposed by Chambolle et al. (2018) and is a practical tool for solving nonsmooth large-scale optimization problems. In this paper we prove its almost sure convergence for convex, but not necessarily strongly convex, functionals. The proof relies on a classical supermartingale result and on rewriting the algorithm as a sequence of random continuous operators on the primal-dual space. We compare our analysis with a similar argument by Alacaoglu et al. and give sufficient conditions for an unproven claim in their proof.
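
For context, the SPDHG iteration of Chambolle et al. (2018) alternates a proximal step on the full primal variable with a proximal step on one randomly sampled dual block, driven by an extrapolated running copy of A^T y. Below is a minimal sketch on a toy LASSO problem; the problem instance, the block splitting, the step-size rule, and all variable names are illustrative assumptions, not taken from the paper.

    # Minimal SPDHG sketch on a toy LASSO problem (illustrative assumptions):
    #   min_x  (1/2)||A x - b||^2 + lam * ||x||_1,
    # with the data term split row-block-wise into n pieces f_i(A_i x).
    import numpy as np

    rng = np.random.default_rng(0)

    m, d, n = 60, 20, 6            # rows, variables, number of dual blocks
    A = rng.standard_normal((m, d))
    b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(m)
    lam = 0.1
    blocks = np.array_split(np.arange(m), n)

    # Step sizes with sigma_i * tau * ||A_i||^2 < 1 for every block,
    # a standard sufficient condition under uniform sampling.
    norms = [np.linalg.norm(A[idx], 2) for idx in blocks]
    gamma = 0.99
    sigma = [gamma / s for s in norms]
    tau = gamma / max(norms)

    def prox_g(x, t):
        """Soft-thresholding: prox of t * lam * ||.||_1."""
        return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

    def prox_fstar(y, s, b_i):
        """Prox of s * f_i^*, where f_i(u) = (1/2)||u - b_i||^2."""
        return (y - s * b_i) / (1.0 + s)

    x = np.zeros(d)
    y = np.zeros(m)                # dual variable, one slice per block
    z = A.T @ y                    # running value of A^T y
    zbar = z.copy()                # extrapolated copy

    for k in range(5000):
        x = prox_g(x - tau * zbar, tau)      # full primal proximal step
        i = rng.integers(n)                  # uniform sampling of one dual block
        idx = blocks[i]
        y_new = prox_fstar(y[idx] + sigma[i] * (A[idx] @ x), sigma[i], b[idx])
        dz = A[idx].T @ (y_new - y[idx])
        y[idx] = y_new
        z = z + dz
        # theta = 1; the factor n = 1/p_i compensates for sampling one block
        zbar = z + n * dz

    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())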
