On the Convergence of the Stochastic Primal-Dual Hybrid Gradient for Convex Optimization

Eric B. Gutierrez, Claire Delplancke, Matthias J. Ehrhardt

The Stochastic Primal-Dual Hybrid Gradient (SPDHG) method was proposed by Chambolle et al. (2018) and is a practical tool for solving nonsmooth large-scale optimization problems. In this paper we prove its almost sure convergence for convex but not necessarily strongly convex functionals. The proof relies on a classical supermartingale result and on reformulating the algorithm as a sequence of random continuous operators on the primal-dual space. We compare our analysis with a similar argument by Alacaoglu et al., and give sufficient conditions for an unproven claim in their proof.
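To illustrate the algorithm the abstract refers to, the following is a minimal sketch of SPDHG in the form given by Chambolle et al. (2018), applied to a LASSO-type problem min_x lam*||x||_1 + (1/2)||Ax - b||^2 with the data-fit term split row-wise. The row-wise splitting, the uniform sampling probabilities p_i = 1/m, the extrapolation parameter theta = 1, and the step-size choice are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t*||.||_1 (componentwise soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def spdhg_lasso(A, b, lam, tau, sigma, n_iter=20000, seed=0):
    """Sketch of SPDHG for min_x lam*||x||_1 + (1/2)||Ax - b||^2,
    with g_i(z) = (1/2)(z - b_i)^2 split over the rows of A
    (illustrative splitting; uniform sampling p_i = 1/m, theta = 1)."""
    rng = np.random.default_rng(seed)
    m, d = A.shape
    x = np.zeros(d)
    y = np.zeros(m)       # dual variable, one coordinate per row
    z = A.T @ y           # running aggregate A^T y
    zbar = z.copy()       # extrapolated dual aggregate
    for _ in range(n_iter):
        # primal step: prox of lam*||.||_1
        x = soft_threshold(x - tau * zbar, tau * lam)
        # sample one dual coordinate uniformly
        i = rng.integers(m)
        # dual step: prox of sigma*g_i^*, with g_i^*(y) = y^2/2 + b_i*y,
        # so prox_{sigma g_i^*}(v) = (v - sigma*b_i)/(1 + sigma)
        y_new = (y[i] + sigma * (A[i] @ x) - sigma * b[i]) / (1.0 + sigma)
        dz = A[i] * (y_new - y[i])
        y[i] = y_new
        z = z + dz
        zbar = z + m * dz  # extrapolation scaled by 1/p_i = m (theta = 1)
    return x
```

Only a single dual coordinate (one row of A) is touched per iteration, which is what makes the method attractive at large scale; the step sizes are chosen here so that roughly tau*sigma*||a_i||^2 <= p_i for each row.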
