On the Convergence of the Stochastic Primal-Dual Hybrid Gradient for Convex Optimization
The Stochastic Primal-Dual Hybrid Gradient (SPDHG) method was proposed by Chambolle et al. (2018) and is a practical tool for solving nonsmooth large-scale optimization problems. In this paper we prove its almost sure convergence for convex, but not necessarily strongly convex, functionals. The proof relies on a classical supermartingale result and on recasting the algorithm as a sequence of random continuous operators acting on the primal-dual space. We compare our analysis with a similar argument by Alacaoglu et al., and give sufficient conditions under which an unproven claim in their proof holds.
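Since the abstract only names the algorithm, a minimal sketch may help fix ideas. The Python code below applies SPDHG with uniform block sampling to a toy problem of the form min_x Σ_i ½‖K_i x − b_i‖² + λ‖x‖₁; the toy objective, the step-size choice, and all identifiers are illustrative assumptions rather than the paper's notation or results.

```python
# A minimal SPDHG sketch (uniform sampling, theta = 1), assuming the
# toy problem  min_x  sum_i 0.5*||K_i x - b_i||^2 + lam*||x||_1.
# Names (spdhg, K_blocks, b_blocks, lam, gamma) are hypothetical.
import numpy as np

def spdhg(K_blocks, b_blocks, lam, n_iter=10000, gamma=0.99, seed=0):
    rng = np.random.default_rng(seed)
    m = len(K_blocks)
    norms = [np.linalg.norm(Ki, 2) for Ki in K_blocks]
    # Step sizes satisfying tau * sigma_i * ||K_i||^2 < p_i = 1/m,
    # the kind of condition used in the SPDHG analysis.
    sigma = [gamma / ni for ni in norms]
    tau = gamma / (m * max(norms))

    n = K_blocks[0].shape[1]
    x = np.zeros(n)
    y = [np.zeros(Ki.shape[0]) for Ki in K_blocks]
    z = sum(Ki.T @ yi for Ki, yi in zip(K_blocks, y))
    zbar = z.copy()

    for _ in range(n_iter):
        # Primal step: prox of g = lam*||.||_1 is soft-thresholding.
        v = x - tau * zbar
        x = np.sign(v) * np.maximum(np.abs(v) - tau * lam, 0.0)
        # Dual step on one randomly sampled block i only;
        # for f_i(u) = 0.5*||u - b_i||^2, prox of sigma*f_i^* is affine.
        i = rng.integers(m)
        y_new = (y[i] + sigma[i] * (K_blocks[i] @ x - b_blocks[i])) / (1 + sigma[i])
        delta = K_blocks[i].T @ (y_new - y[i])
        y[i] = y_new
        # Running sum z = K^T y and its extrapolation (theta = 1, p_i = 1/m).
        z = z + delta
        zbar = z + m * delta
    return x
```

The extrapolated variable `zbar` corrects for updating only one dual block per iteration; it is this randomized update that the almost sure convergence analysis must control.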