Preprints, Working Papers, ... Year: 2015

A Universal Catalyst for First-Order Optimization

Hongzhou Lin
Julien Mairal
Zaid Harchaoui

Abstract

We introduce a generic scheme for accelerating first-order optimization methods in the sense of Nesterov. Our approach consists of minimizing a convex objective by approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. This strategy applies to a large class of algorithms, including gradient descent, block coordinate descent, SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants. For all of these approaches, we provide acceleration and explicit support for non-strongly convex objectives. In addition to theoretical speed-up, we also show that acceleration is useful in practice, especially for ill-conditioned problems where we measure dramatic improvements.
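
The scheme described in the abstract amounts to an accelerated inexact proximal-point loop: each outer step approximately minimizes the auxiliary problem f(x) + (kappa/2)||x - y||^2 around an extrapolated point y, using any first-order method as the inner solver. The Python sketch below is a minimal illustration, not the authors' reference implementation: it uses plain gradient descent with a fixed step size and iteration budget as the inner solver, and the function names, step sizes, initialization of the extrapolation sequence, and toy data are assumptions made for the example.

    import numpy as np

    def catalyst(grad_f, x0, kappa, mu=0.0, lr_inner=1e-3,
                 outer_iters=50, inner_iters=100):
        """Accelerated inexact proximal-point loop in the spirit of Catalyst.

        Each outer iteration approximately minimizes the auxiliary problem
            g_k(x) = f(x) + (kappa / 2) * ||x - y_{k-1}||^2
        with a fixed number of plain gradient steps (an illustrative inner
        solver), then applies a Nesterov-style extrapolation step.
        """
        q = mu / (mu + kappa)                  # inverse condition number of g_k
        alpha = np.sqrt(q) if q > 0 else 1.0   # assumed initialization of alpha_0
        x_prev = np.array(x0, dtype=float)
        y = x_prev.copy()

        for _ in range(outer_iters):
            # Approximately solve the auxiliary problem by gradient descent;
            # its gradient is grad_f(x) + kappa * (x - y).
            x = x_prev.copy()
            for _ in range(inner_iters):
                x -= lr_inner * (grad_f(x) + kappa * (x - y))

            # Update the extrapolation sequence: alpha_next solves
            # alpha_next^2 = (1 - alpha_next) * alpha^2 + q * alpha_next.
            a2 = alpha ** 2
            alpha_next = 0.5 * (q - a2 + np.sqrt((a2 - q) ** 2 + 4.0 * a2))
            beta = alpha * (1.0 - alpha) / (a2 + alpha_next)

            # Extrapolate to obtain the next prox center.
            y = x + beta * (x - x_prev)
            x_prev, alpha = x, alpha_next

        return x_prev

    # Toy usage (hypothetical data): ill-conditioned least squares,
    # f(x) = 0.5 * ||A x - b||^2.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 50)) * np.logspace(0, -3, 50)
    b = rng.standard_normal(200)
    grad_f = lambda x: A.T @ (A @ x - b)
    x_hat = catalyst(grad_f, np.zeros(50), kappa=1.0)
    print("residual norm:", np.linalg.norm(A @ x_hat - b))

In the paper the inner solver is stopped according to an accuracy criterion rather than a fixed iteration count, and the inner method can be any of the algorithms listed above (SAG, SAGA, SVRG, etc.); plain gradient descent is used here only to keep the sketch self-contained.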
Main file
paper.pdf (456.74 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01160728, version 1 (06-06-2015)
hal-01160728, version 2 (25-10-2015)

Identifiers

  • HAL Id: hal-01160728, version 1

Cite

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. A Universal Catalyst for First-Order Optimization. 2015. ⟨hal-01160728v1⟩