Preprints, Working Papers, ...

Acceleration Methods

Abstract: This monograph covers recent advances in a range of acceleration techniques frequently used in convex optimization. We first use quadratic optimization problems to introduce two key families of methods, momentum and nested optimization schemes; in the quadratic case these coincide, forming the Chebyshev method, whose complexity is analyzed using Chebyshev polynomials. We then discuss momentum methods in detail, starting with the seminal work of Nesterov (1983), and structure convergence proofs using a few master templates, such as that of \emph{optimized gradient methods}, which have the key benefit of showing how momentum methods maximize convergence rates. We further cover proximal acceleration techniques, at the heart of the \emph{Catalyst} and \emph{Accelerated Hybrid Proximal Extragradient} frameworks, using similar algorithmic patterns. Common acceleration techniques rely directly on knowledge of some regularity parameters of the problem at hand, and we conclude by discussing \emph{restart} schemes, a set of simple techniques for reaching nearly optimal convergence rates while adapting to unobserved regularity parameters.
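To make the momentum idea concrete, here is a minimal sketch of a standard textbook variant of Nesterov's accelerated gradient method for an L-smooth convex function, applied to a simple quadratic. This is an illustration of the general technique the abstract refers to, not the monograph's exact formulation; the function and parameter names are chosen for this example.

```python
import numpy as np

def nesterov_agd(grad, x0, L, n_iters):
    """Nesterov-style accelerated gradient descent (momentum scheme).

    grad   : callable returning the gradient of an L-smooth convex function
    x0     : starting point
    L      : smoothness (Lipschitz) constant of the gradient
    n_iters: number of iterations
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()          # extrapolated point
    t = 1.0               # momentum parameter
    for _ in range(n_iters):
        x_next = y - grad(y) / L                         # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x) # momentum extrapolation
        x, t = x_next, t_next
    return x

# Example: minimize f(x) = 0.5 * x^T A x, whose unique minimizer is 0.
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
L = 100.0  # largest eigenvalue of A = smoothness constant
x_star = nesterov_agd(grad, np.ones(3), L, 300)
```

The extrapolation step `y = x_next + ((t-1)/t_next)(x_next - x)` is what distinguishes this from plain gradient descent and yields the accelerated O(1/k^2) rate on smooth convex problems.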

Contributor: Adrien Taylor
Submitted on: Monday, March 1, 2021 - 10:04:42 AM
Last modification on: Thursday, March 17, 2022 - 10:08:54 AM



  • HAL Id: hal-03154589, version 1
  • arXiv: 2101.09545



Alexandre d'Aspremont, Damien Scieur, Adrien Taylor. Acceleration Methods. 2021. ⟨hal-03154589⟩


