
Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees

Adrien Taylor¹, Bryan van Scoy, Laurent Lessard
¹ SIERRA (Statistical Machine Learning and Parsimony), DI-ENS – Département d'informatique de l'École normale supérieure, CNRS – Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We present a novel way of generating Lyapunov functions for proving linear convergence rates of first-order optimization methods. Our approach provably obtains the fastest linear convergence rate that can be verified by a quadratic Lyapunov function (with given states), and only relies on solving a small-sized semidefinite program. Our approach combines the advantages of performance estimation problems (PEP, due to Drori & Teboulle (2014)) and integral quadratic constraints (IQC, due to Lessard et al. (2016)), and relies on convex interpolation (due to Taylor et al. (2017c;b)).
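To illustrate the kind of guarantee a quadratic Lyapunov function certifies, here is a minimal numpy sketch for gradient descent on a strongly convex quadratic. This toy check is not the authors' SDP machinery: the choices of Q, step size, and the identity Lyapunov matrix P = I are illustrative assumptions, and the setting is simple enough that the Lyapunov decrease reduces to an eigenvalue test.

```python
import numpy as np

# Toy setting: f(x) = 0.5 * x^T Q x with mu*I <= Q <= L*I.
# Gradient descent iterates x_{k+1} = (I - alpha*Q) x_k.
# A quadratic Lyapunov function V(x) = x^T P x (here P = I)
# certifies linear convergence at rate rho if V(x_{k+1}) <= rho^2 * V(x_k),
# i.e. the matrix inequality A^T P A - rho^2 P <= 0 holds.

rng = np.random.default_rng(0)
n = 5
mu, L = 1.0, 10.0

# Random symmetric Q with spectrum in [mu, L] (illustrative choice).
eigs = rng.uniform(mu, L, size=n)
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
Q = U @ np.diag(eigs) @ U.T

alpha = 2.0 / (mu + L)      # classical step size for this problem class
A = np.eye(n) - alpha * Q   # iteration matrix of gradient descent
rho = max(abs(1 - alpha * mu), abs(1 - alpha * L))  # = (L - mu) / (L + mu)

# Check the Lyapunov condition A^T A - rho^2 I <= 0 via its largest eigenvalue.
M = A.T @ A - rho**2 * np.eye(n)
max_eig = np.linalg.eigvalsh(M).max()
print(max_eig <= 1e-9)  # Lyapunov decrease certified at rate rho
```

In general (non-quadratic smooth strongly convex functions, methods with memory), no such closed-form check exists; finding the best P and the smallest rho verifiable this way is exactly the small semidefinite program the abstract refers to.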
Contributor: Adrien Taylor
Submitted on: Thursday, October 25, 2018
Last modified on: Thursday, July 1, 2021



  • HAL Id: hal-01902068, version 1
  • arXiv: 1803.06073



Adrien Taylor, Bryan van Scoy, Laurent Lessard. Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees. Proceedings of the 35th International Conference on Machine Learning. PMLR 80:4897-4906, Jul 2018, Stockholm, Sweden. ⟨hal-01902068⟩
