Catalyst Acceleration for Gradient-Based Non-Convex Optimization

Abstract: We introduce a generic scheme to solve nonconvex optimization problems using gradient-based algorithms originally designed for minimizing convex functions. When the objective is convex, the proposed approach enjoys the same properties as the Catalyst approach of Lin et al. (2015). When the objective is nonconvex, it achieves the best known convergence rate to stationary points for first-order methods. Specifically, the proposed algorithm does not require knowledge about the convexity of the objective; yet, it obtains an overall worst-case efficiency of O(ε^{-2}) and, if the function is convex, the complexity reduces to the near-optimal rate O(ε^{-2/3}). We conclude the paper by showing promising experimental results obtained by applying the proposed approach to SVRG and SAGA for sparse matrix factorization and for learning neural networks.
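The generic scheme summarized in the abstract, an outer loop that repeatedly adds a quadratic proximal term to the objective, calls an inner gradient-based solver on the regularized subproblem, and extrapolates, can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the paper's exact algorithm: plain gradient descent stands in for the SVRG/SAGA inner solvers, `kappa` is fixed to 1, and the momentum recursion shown is the standard Nesterov/Catalyst one for the non-strongly-convex case.

```python
import numpy as np

def gd_inner(grad, x, lr=0.1, steps=50):
    # Plain gradient descent as a stand-in inner solver (the paper uses
    # SVRG and SAGA); approximately minimizes the regularized subproblem.
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def catalyst(f_grad, x0, inner_solver, kappa=1.0, n_outer=20):
    # Illustrative Catalyst-style outer loop: at iteration k, approximately
    # minimize  h_k(x) = f(x) + (kappa / 2) * ||x - y_k||^2  via the inner
    # solver, then apply Nesterov-style extrapolation to build y_{k+1}.
    x, y, alpha = x0.copy(), x0.copy(), 1.0
    for _ in range(n_outer):
        # Gradient of the kappa-regularized subproblem around the prox center y.
        x_new = inner_solver(lambda z: f_grad(z) + kappa * (z - y), x)
        # Momentum coefficients (convex case, q = 0):
        # alpha_{k+1}^2 = (1 - alpha_{k+1}) * alpha_k^2.
        alpha_new = 0.5 * (np.sqrt(alpha**4 + 4 * alpha**2) - alpha**2)
        beta = alpha * (1 - alpha) / (alpha**2 + alpha_new)
        y = x_new + beta * (x_new - x)
        x, alpha = x_new, alpha_new
    return x

# Usage sketch: minimize f(x) = 0.5 * ||x - c||^2, whose gradient is x - c.
c = np.array([1.0, -2.0])
x_final = catalyst(lambda z: z - c, np.zeros(2), gd_inner)
```

On this toy quadratic the iterates converge to the minimizer `c`; the point of the wrapper is that the same outer loop can be run without knowing whether `f_grad` comes from a convex objective.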
Document type:
Preprint / working paper
2017

Cited literature: 39 references

https://hal.inria.fr/hal-01536017
Contributor: Julien Mairal
Submitted on: Friday, June 9, 2017 - 21:27:31
Last modified on: Friday, November 24, 2017 - 13:27:10
Document(s) archived on: Sunday, September 10, 2017 - 13:58:56

File

arxiv_v2.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01536017, version 1

Collections

Citation

Courtney Paquette, Hongzhou Lin, Dmitriy Drusvyatskiy, Julien Mairal, Zaid Harchaoui. Catalyst Acceleration for Gradient-Based Non-Convex Optimization. 2017. 〈hal-01536017〉


Metrics

Record views: 290

File downloads: 60