A Universal Catalyst for First-Order Optimization

Hongzhou Lin¹, Julien Mairal¹, Zaid Harchaoui²,¹
¹ LEAR - Learning and Recognition in Vision, Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann, INPG - Institut National Polytechnique de Grenoble
Abstract: We introduce a generic scheme for accelerating first-order optimization methods in the sense of Nesterov, which builds upon a new analysis of the accelerated proximal point algorithm. Our approach consists of minimizing a convex objective by approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. This strategy applies to a large class of algorithms, including gradient descent, block coordinate descent, SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants. For all of these methods, we provide acceleration and explicit support for non-strongly convex objectives. In addition to theoretical speed-up, we also show that acceleration is useful in practice, especially for ill-conditioned problems where we measure significant improvements.
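The abstract's strategy — wrapping a first-order method so that it approximately minimizes a sequence of quadratically regularized auxiliary problems, with Nesterov-style extrapolation between them — can be sketched as follows. This is a hedged illustration, not the paper's implementation: the function names, the fixed step size, the inner solver (plain gradient descent), and the iteration counts are all illustrative assumptions, and the momentum recursion shown is the standard one for the non-strongly convex case.

```python
import numpy as np

def catalyst(grad_f, x0, kappa=1.0, n_outer=50, n_inner=100, lr=0.1):
    """Illustrative sketch of a Catalyst-style outer loop.

    Each outer iteration approximately minimizes the auxiliary objective
        h_k(x) = f(x) + (kappa / 2) * ||x - y_k||^2
    with a few inner gradient-descent steps, then applies a Nesterov-style
    extrapolation to build the next prox-center y_{k+1}.
    All parameter choices here are assumptions for demonstration.
    """
    x = x0.copy()
    y = x0.copy()
    alpha_prev = 1.0
    for _ in range(n_outer):
        # Approximately solve the auxiliary problem with inner gradient descent.
        z = x.copy()
        for _ in range(n_inner):
            z = z - lr * (grad_f(z) + kappa * (z - y))
        x_prev, x = x, z
        # Momentum recursion for the non-strongly convex case:
        # alpha_k solves alpha^2 = (1 - alpha) * alpha_prev^2.
        alpha = 0.5 * (np.sqrt(alpha_prev**4 + 4 * alpha_prev**2) - alpha_prev**2)
        beta = alpha_prev * (1 - alpha_prev) / (alpha_prev**2 + alpha)
        y = x + beta * (x - x_prev)
        alpha_prev = alpha
    return x

# Usage: minimize the smooth convex quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
grad_f = lambda x: A.T @ (A @ x - b)
x_star = np.linalg.solve(A.T @ A, A.T @ b)  # closed-form minimizer
x_hat = catalyst(grad_f, np.zeros(2))
```

Note that the inner solver is treated as a black box: any of the methods listed in the abstract could replace the inner gradient-descent loop, which is what makes the scheme generic.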
Document type: Conference paper
28th International Conference on Neural Information Processing Systems, NIPS'15, Dec 2015, Montreal, Canada. MIT Press, pp. 3384-3392


https://hal.inria.fr/hal-01160728
Contributor: Julien Mairal
Submitted on: Sunday, October 25, 2015 - 11:15:13
Last modified on: Thursday, February 9, 2017 - 14:40:49
Document(s) archived on: Tuesday, January 26, 2016 - 10:43:23

File

paper_appendix.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01160728, version 2

Citation

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. A Universal Catalyst for First-Order Optimization. 28th International Conference on Neural Information Processing Systems, NIPS'15, Dec 2015, Montreal, Canada. MIT Press, pp. 3384-3392. <hal-01160728v2>

Metrics

Record views: 987
Document downloads: 371