Analysis and Implementation of an Asynchronous Optimization Algorithm for the Parameter Server

Arda Aytekin, Hamid Reza Feyzmahdavian, Mikael Johansson

This paper presents an asynchronous incremental aggregated gradient algorithm and its implementation in a parameter server framework for solving regularized optimization problems. The algorithm can handle both general convex (possibly non-smooth) regularizers and general convex constraints. When the empirical data loss is strongly convex, we establish a linear convergence rate, give explicit expressions for step-size choices that guarantee convergence to the optimum, and bound the associated convergence factors. The expressions depend explicitly on the degree of asynchrony and recover classical results under synchronous operation. Simulations and implementations on commercial compute clouds validate our findings.
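The abstract describes an incremental aggregated gradient method with a proximal step for the regularizer, where workers may report gradients computed at stale iterates. The sketch below is a minimal single-process illustration of that idea, not the authors' implementation: it maintains a table of the last gradient reported for each data point, aggregates them, and simulates asynchrony by evaluating new gradients at an iterate `delay` steps old. The least-squares loss, l1 regularizer, and all parameter choices are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (assumed l1 regularizer)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_iag(A, b, lam, step, iters, delay=2):
    """Proximal incremental aggregated gradient with simulated staleness.

    Minimizes (1/2n) * sum_i (a_i^T x - b_i)^2 + lam * ||x||_1.
    `delay` mimics asynchronous workers returning gradients evaluated
    at an iterate that is `delay` updates behind the master's copy.
    """
    n, d = A.shape
    x = np.zeros(d)
    history = [x.copy()]            # past iterates, to draw stale copies from
    grads = np.zeros((n, d))        # last gradient reported per data point
    agg = grads.sum(axis=0)         # running aggregate of stored gradients
    for k in range(iters):
        i = k % n                                   # round-robin over data points
        stale = history[max(0, len(history) - 1 - delay)]
        g_new = A[i] * (A[i] @ stale - b[i])        # gradient at a stale iterate
        agg += g_new - grads[i]                     # replace i's entry in the aggregate
        grads[i] = g_new
        x = soft_threshold(x - step * agg / n, step * lam)  # proximal step
        history.append(x.copy())
    return x
```

Under synchronous operation (`delay=0`) this reduces to a standard proximal incremental aggregated gradient update; the abstract's step-size conditions would shrink the admissible `step` as the maximum staleness grows.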
