Analysis of Optimization Algorithms via Sum-of-Squares

Sandra S. Y. Tan, Antonios Varvitsiotis, Vincent Y. F. Tan

In this work, we introduce a new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization. The low per-iteration cost enjoyed by first-order algorithms renders them particularly relevant for applications in machine learning and large-scale data analysis. Our approach is based on sum-of-squares optimization, which allows us to introduce a hierarchy of semidefinite programs (SDPs) that yield increasingly tighter convergence bounds at higher levels of the hierarchy. The (dual of the) first level of the sum-of-squares hierarchy corresponds to the SDP reformulation of the Performance Estimation Problem, first introduced by Drori and Teboulle [Math. Program., 145(1):451-482, 2014] and developed further by Taylor, Hendrickx, and Glineur [Math. Program., 161(1):307-345, 2017]. Illustrating the usefulness of our approach, we recover, in a unified manner, several known convergence bounds for four widely-used first-order algorithms, and also derive new convergence results for noisy gradient descent with inexact line search methods.
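To make the first level of the hierarchy concrete, below is a minimal sketch (ours, not the authors' code) of the Taylor-Hendrickx-Glineur SDP reformulation of the Performance Estimation Problem for N steps of gradient descent with step size 1/L on an L-smooth convex function; the horizon N, the constants L and R, and the CVXPY modeling choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import cvxpy as cp

# Illustrative parameters (assumptions, not from the paper):
N = 5        # number of gradient steps
L = 1.0      # smoothness constant
R = 1.0      # bound on ||x_0 - x_*||

dim = N + 2  # Gram basis: [x_0 - x_*, g_0, ..., g_N]
G = cp.Variable((dim, dim), PSD=True)  # Gram matrix of the basis vectors
F = cp.Variable(N + 1)                 # function values f_0, ..., f_N (f_* := 0)

def x(i):
    # Coordinates of x_i - x_* in the Gram basis; i = None encodes x_*.
    v = np.zeros(dim)
    if i is not None:
        v[0] = 1.0
        for k in range(i):             # unroll x_{k+1} = x_k - (1/L) g_k
            v[1 + k] -= 1.0 / L
    return v

def g(i):
    # Coordinates of grad f(x_i); the gradient vanishes at x_*.
    v = np.zeros(dim)
    if i is not None:
        v[1 + i] = 1.0
    return v

def f(i):
    return F[i] if i is not None else 0.0

def ip(u, v):
    # Inner product <u, v> of basis combinations; linear in G.
    return u @ G @ v

pts = list(range(N + 1)) + [None]      # iterates plus the minimizer x_*
cons = [ip(x(0), x(0)) <= R ** 2]      # ||x_0 - x_*||^2 <= R^2
for i in pts:
    for j in pts:
        if i == j:
            continue
        # Smooth-convex interpolation condition (Taylor et al., 2017):
        # f_i >= f_j + <g_j, x_i - x_j> + (1/(2L)) ||g_i - g_j||^2
        dx, dg = x(i) - x(j), g(i) - g(j)
        cons.append(f(i) - f(j) - ip(g(j), dx) - ip(dg, dg) / (2 * L) >= 0)

prob = cp.Problem(cp.Maximize(f(N)), cons)  # worst-case final accuracy
prob.solve()
print("worst case f(x_N) - f_*   :", prob.value)
print("tight bound LR^2/(4N+2)   :", L * R ** 2 / (4 * N + 2))
```

Under these assumptions, the optimal value of this SDP should numerically match the tight bound LR^2/(4N+2) for fixed-step gradient descent established in the Drori-Teboulle and Taylor-Hendrickx-Glineur works cited above.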
