
A Universal Catalyst for First-Order Optimization

Hongzhou Lin¹, Julien Mairal¹, Zaid Harchaoui¹
¹ LEAR - Learning and recognition in vision, Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann, INPG - Institut National Polytechnique de Grenoble
Abstract: We introduce a generic scheme for accelerating first-order optimization methods in the sense of Nesterov. Our approach consists of minimizing a convex objective by approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. This strategy applies to a large class of algorithms, including gradient descent, block coordinate descent, SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants. For all of these approaches, we provide acceleration and explicit support for non-strongly convex objectives. In addition to theoretical speed-ups, we also show that acceleration is useful in practice, especially for ill-conditioned problems where we measure dramatic improvements.
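The scheme sketched in the abstract (a "catalyst" outer loop wrapped around an inner first-order solver) can be illustrated as follows. This is a minimal, hedged sketch, not the authors' reference implementation: it assumes a smooth, strongly convex objective accessed through its gradient, uses plain gradient descent as the inner solver, and the parameter names (`kappa`, `mu`, `n_outer`, `n_inner`, `lr`) are illustrative choices, not from the paper.

```python
import numpy as np

def catalyst(grad_f, x0, kappa=1.0, mu=1.0, n_outer=50, n_inner=50, lr=0.05):
    """Sketch of a Catalyst-style outer loop (illustrative parameters).

    Each outer iteration approximately minimizes the auxiliary problem
        h(x) = f(x) + kappa/2 * ||x - y||^2
    with a few gradient-descent steps, then applies a Nesterov-style
    extrapolation to the approximate solution.
    """
    q = mu / (mu + kappa)      # inverse condition number after regularization
    alpha = np.sqrt(q)         # extrapolation sequence initialization
    x_prev = x0.copy()
    y = x0.copy()
    x = x0.copy()
    for _ in range(n_outer):
        # Inner loop: approximately solve min_x f(x) + kappa/2 ||x - y||^2.
        x = y.copy()
        for _ in range(n_inner):
            x = x - lr * (grad_f(x) + kappa * (x - y))
        # Update alpha_{k+1} from alpha_{k+1}^2 = (1-alpha_{k+1}) alpha_k^2
        # + q alpha_{k+1} (positive root of the quadratic).
        a2 = alpha ** 2
        alpha_next = 0.5 * ((q - a2) + np.sqrt((q - a2) ** 2 + 4.0 * a2))
        # Extrapolation step on the approximate inner solution.
        beta = alpha * (1.0 - alpha) / (alpha ** 2 + alpha_next)
        y = x + beta * (x - x_prev)
        x_prev, alpha = x.copy(), alpha_next
    return x
```

For example, on an ill-conditioned quadratic `f(x) = 0.5 * x.T @ A @ x`, calling `catalyst(lambda x: A @ x, x0)` drives the iterate toward the minimizer at the origin; the point of the added `kappa`-regularization is precisely that the auxiliary problems are better conditioned than `f` itself.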
Contributor: Julien Mairal
Submitted on: Saturday, June 6, 2015 - 9:56:32 PM
Last modification on: Friday, April 17, 2020 - 11:46:03 AM
Long-term archiving on: Tuesday, September 15, 2015 - 11:57:43 AM




  • HAL Id : hal-01160728, version 1


Hongzhou Lin, Julien Mairal, Zaid Harchaoui. A Universal Catalyst for First-Order Optimization. 2015. ⟨hal-01160728v1⟩


