The Stochastic Primal-Dual Hybrid Gradient (SPDHG) algorithm, proposed by Chambolle et al. (2018), is an efficient method for solving a class of nonsmooth, large-scale optimization problems. In this paper we prove its almost sure convergence for convex but not necessarily strongly convex functionals. To test the performance of SPDHG, we also apply it to parallel Magnetic Resonance Imaging reconstruction. Our numerical results show that, across a range of settings, SPDHG converges significantly faster than its deterministic counterpart.
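For orientation, the following is a minimal sketch of one SPDHG iteration for a saddle-point problem of the form $\min_x \sum_{i=1}^m f_i(A_i x) + g(x)$, following the general scheme of Chambolle et al. (2018). The sampling probabilities $p_i$, step sizes $\tau, \sigma_i$, and extrapolation parameter $\theta$ are assumptions for illustration; the cited work states the precise step-size conditions.

% One SPDHG iteration (sketch; p_i, \tau, \sigma_i, \theta assumed as above)
\begin{align*}
  x^{n+1} &= \operatorname{prox}_{\tau g}\!\bigl(x^{n} - \tau A^{*}\bar y^{\,n}\bigr)
    && \text{primal proximal step with extrapolated dual variable}\\
  &\text{select } i \in \{1,\dots,m\} \text{ with probability } p_i
    && \text{random choice of one dual block}\\
  y_i^{n+1} &= \operatorname{prox}_{\sigma_i f_i^{*}}\!\bigl(y_i^{n} + \sigma_i A_i x^{n+1}\bigr),
  \quad y_j^{n+1} = y_j^{n} \ (j \neq i)
    && \text{update only the selected block}\\
  \bar y^{\,n+1} &= y^{n+1} + \tfrac{\theta}{p_i}\bigl(y^{n+1} - y^{n}\bigr)
    && \text{extrapolation, rescaled by } 1/p_i
\end{align*}

With $m = 1$ and $p_1 = 1$ this reduces to the deterministic primal-dual hybrid gradient method, which is the counterpart used for comparison in the numerical experiments.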