
Acceleration Methods

Alexandre d'Aspremont (1, 2), Damien Scieur (3), Adrien Taylor (2)
2: SIERRA - Statistical Machine Learning and Parsimony, DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: This monograph covers some recent advances in a range of acceleration techniques frequently used in convex optimization. We first use quadratic optimization problems to introduce two key families of methods, momentum and nested optimization schemes, which coincide in the quadratic case to form the Chebyshev method, whose complexity is analyzed using Chebyshev polynomials. We discuss momentum methods in detail, starting with the seminal work of Nesterov (1983), and structure convergence proofs using a few master templates, such as that of \emph{optimized gradient methods}, which have the key benefit of showing how momentum methods maximize convergence rates. We further cover proximal acceleration techniques, at the heart of the \emph{Catalyst} and \emph{Accelerated Hybrid Proximal Extragradient} frameworks, using similar algorithmic patterns. Common acceleration techniques rely directly on knowledge of some regularity parameters of the problem at hand, and we conclude by discussing \emph{restart} schemes, a set of simple techniques for reaching nearly optimal convergence rates while adapting to unobserved regularity parameters.
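
As an illustration of the momentum schemes the abstract refers to, below is a minimal sketch of Nesterov-style accelerated gradient descent on a smooth convex quadratic. The step size 1/L, the random test problem, and all function and variable names are illustrative assumptions, not material taken from the monograph.

```python
import numpy as np

def accelerated_gradient(grad, x0, L, iterations=200):
    """Nesterov-style accelerated gradient descent (momentum scheme).

    grad : callable returning the gradient of an L-smooth convex function
    x0   : starting point
    L    : Lipschitz constant of the gradient (smoothness parameter)
    """
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()          # extrapolated point y_0 = x_0
    t = 1.0                    # momentum weight t_0 = 1
    for _ in range(iterations):
        # Gradient step taken from the extrapolated point y_k
        x = y - grad(y) / L
        # Momentum weight update: t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation: y_{k+1} = x_{k+1} + ((t_k - 1)/t_{k+1}) (x_{k+1} - x_k)
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev

# Illustrative quadratic f(x) = 0.5 * x^T A x - b^T x with A positive semidefinite,
# for which L is the largest eigenvalue of A.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 20))
A = M.T @ M
b = rng.standard_normal(20)
L = np.linalg.eigvalsh(A).max()
x_hat = accelerated_gradient(lambda x: A @ x - b, np.zeros(20), L)
```

Restart schemes, also mentioned in the abstract, can be layered on top of such a routine by periodically resetting the momentum weight t, which is one way to adapt to unobserved regularity parameters.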

https://hal.inria.fr/hal-03154589
Contributor: Adrien Taylor
Submitted on: Monday, March 1, 2021 - 10:04:42 AM
Last modification on: Friday, March 5, 2021 - 3:28:57 AM

Identifiers

  • HAL Id: hal-03154589, version 1
  • arXiv: 2101.09545

Citation

Alexandre d'Aspremont, Damien Scieur, Adrien Taylor. Acceleration Methods. 2021. ⟨hal-03154589⟩
