Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice

Abstract: We introduce a generic scheme for accelerating gradient-based optimization methods in the sense of Nesterov. The approach, called Catalyst, builds upon the inexact accelerated proximal point algorithm for minimizing a convex objective function, and consists of approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. One of the keys to achieving acceleration in theory and in practice is to solve these sub-problems with appropriate accuracy by using the right stopping criterion and the right warm-start strategy. In this paper, we give practical guidelines for using Catalyst and present a comprehensive theoretical analysis of its global complexity. We show that Catalyst applies to a large class of algorithms, including gradient descent, block coordinate descent, incremental algorithms such as SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants. For all of these methods, we provide acceleration and explicit support for non-strongly convex objectives. We conclude with extensive experiments showing that acceleration is useful in practice, especially for ill-conditioned problems.
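The abstract describes an outer loop that approximately minimizes a sequence of quadratically regularized sub-problems and then applies a Nesterov-style extrapolation step. The following is a minimal sketch of that outer loop, not the authors' implementation: the function names, parameters, and the use of plain gradient descent as the inner solver (a stand-in for methods like SVRG or MISO) are illustrative assumptions.

```python
import numpy as np

def catalyst(grad_f, x0, kappa, mu, n_outer=50, n_inner=100, lr=0.1):
    """Sketch of the Catalyst outer loop for a mu-strongly convex f.

    Each outer iteration approximately minimizes the auxiliary problem
        h_k(x) = f(x) + (kappa / 2) * ||x - y_k||^2
    with a few gradient steps (hypothetical inner solver), warm-started
    at the previous iterate, then extrapolates the result.
    """
    q = mu / (mu + kappa)
    alpha = np.sqrt(q)
    x, y = x0.copy(), x0.copy()
    for _ in range(n_outer):
        # Inner loop: approximate solution of the kappa-regularized
        # sub-problem, warm-started at the previous outer iterate x.
        z = x.copy()
        for _ in range(n_inner):
            z -= lr * (grad_f(z) + kappa * (z - y))
        x_prev, x = x, z
        # Update the extrapolation sequence:
        # alpha_{k+1}^2 = (1 - alpha_{k+1}) * alpha_k^2 + q * alpha_{k+1}
        alpha_next = 0.5 * ((q - alpha**2)
                            + np.sqrt((alpha**2 - q)**2 + 4 * alpha**2))
        beta = alpha * (1 - alpha) / (alpha**2 + alpha_next)
        y = x + beta * (x - x_prev)  # momentum / extrapolation step
        alpha = alpha_next
    return x
```

In this sketch the stopping criterion for the inner loop is a fixed iteration budget; the paper's point is precisely that the choice of inner accuracy and warm start governs whether acceleration materializes.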
Document type:
Journal article
Journal of Machine Learning Research, 2018, 18 (212), pp. 1-54

Cited literature: 35 references

https://hal.inria.fr/hal-01664934
Contributor: Julien Mairal
Submitted on: Tuesday, June 19, 2018 - 09:10:25
Last modified on: Friday, June 29, 2018 - 13:07:12

File

17-748.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01664934, version 2

Citation

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice. Journal of Machine Learning Research, 2018, 18 (212), pp. 1-54. 〈hal-01664934v2〉
