A Universal Catalyst for First-Order Optimization

Conference paper

Hongzhou Lin (1), Julien Mairal (1), Zaid Harchaoui (2, 1)
Abstract

We introduce a generic scheme for accelerating first-order optimization methods in the sense of Nesterov, which builds upon a new analysis of the accelerated proximal point algorithm. Our approach consists of minimizing a convex objective by approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. This strategy applies to a large class of algorithms, including gradient descent, block coordinate descent, SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants. For all of these methods, we provide acceleration and explicit support for non-strongly convex objectives. In addition to the theoretical speed-up, we also show that acceleration is useful in practice, especially for ill-conditioned problems, where we measure significant improvements.
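The scheme the abstract describes can be sketched as an outer loop that repeatedly (i) approximately minimizes the auxiliary problem G_k(x) = f(x) + (κ/2)‖x − y_{k−1}‖² with any first-order inner solver, and (ii) applies Nesterov-style extrapolation to the approximate solutions. The following is a minimal illustrative sketch, not the authors' code: the function name `catalyst`, the plain gradient-descent inner solver, and all step-size choices are assumptions made for the example.

```python
import numpy as np

def catalyst(grad_f, x0, kappa=1.0, mu=0.0,
             outer_iters=50, inner_iters=100, inner_lr=0.1):
    """Illustrative sketch of a Catalyst-style outer loop.

    grad_f : gradient of the (convex) objective f
    kappa  : weight of the proximal regularization in the auxiliary problem
    mu     : strong-convexity parameter of f (0 for the non-strongly convex case)
    """
    q = mu / (mu + kappa)          # q = mu / (mu + kappa)
    alpha_prev = np.sqrt(q) if q > 0 else 1.0
    x_prev = x0.copy()
    y = x0.copy()
    for _ in range(outer_iters):
        # Inner solver (assumed here: plain gradient descent) approximately
        # minimizes G(x) = f(x) + (kappa/2) * ||x - y||^2, which is
        # kappa-strongly convex even when f is not.
        x = y.copy()
        for _ in range(inner_iters):
            g = grad_f(x) + kappa * (x - y)
            x = x - inner_lr * g
        # Solve alpha^2 = (1 - alpha) * alpha_prev^2 + q * alpha for alpha in (0, 1)
        a2 = alpha_prev ** 2
        alpha = 0.5 * ((q - a2) + np.sqrt((q - a2) ** 2 + 4.0 * a2))
        # Extrapolation step on the approximate subproblem solutions
        beta = alpha_prev * (1.0 - alpha_prev) / (alpha_prev ** 2 + alpha)
        y = x + beta * (x - x_prev)
        x_prev, alpha_prev = x, alpha
    return x_prev
```

For example, with `grad_f = lambda x: x - c` (i.e. f(x) = ½‖x − c‖²), the iterates converge to `c`. In practice, the paper's point is that the inner solver can be any of the listed methods (SAG, SVRG, MISO, ...), and the outer loop transfers Nesterov-type acceleration to it.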
Main file: paper_appendix.pdf (428.77 KB). Origin: files produced by the author(s).

Dates and versions

hal-01160728, version 1 (06-06-2015)
hal-01160728, version 2 (25-10-2015)

Identifiers

  • HAL Id: hal-01160728, version 2

Cite

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. A Universal Catalyst for First-Order Optimization. NIPS - Advances in Neural Information Processing Systems, Dec 2015, Montreal, Canada. pp. 3384-3392. ⟨hal-01160728v2⟩
1574 views, 631 downloads
