Catalyst for Gradient-based Nonconvex Optimization

Abstract: We introduce a generic scheme to solve non-convex optimization problems using gradient-based algorithms originally designed for minimizing convex functions. Even though these methods may originally require convexity to operate, the proposed approach allows one to use them without assuming any knowledge about the convexity of the objective. In general, the scheme is guaranteed to produce a stationary point with a worst-case efficiency typical of first-order methods, and when the objective turns out to be convex, it automatically accelerates in the sense of Nesterov and achieves near-optimal convergence rate in function values. We conclude the paper by showing promising experimental results obtained by applying our approach to incremental algorithms such as SVRG and SAGA for sparse matrix factorization and for learning neural networks.
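The Catalyst idea can be illustrated with a minimal sketch: an outer loop that repeatedly (and approximately) minimizes a quadratically regularized subproblem centered at an extrapolated point, then performs a Nesterov-style extrapolation step. Everything below is an illustrative assumption, not the paper's exact algorithm: the inner solver (plain gradient descent standing in for SVRG/SAGA), the fixed regularization weight `kappa`, and the extrapolation weight `(k - 1) / (k + 2)` are all simplified choices for exposition.

```python
import numpy as np

def inner_solve(grad_f, x0, y, kappa, lr=0.1, iters=100):
    # Approximately minimize f(x) + (kappa/2) * ||x - y||^2 by gradient
    # descent; in the paper, an incremental method (SVRG, SAGA) plays
    # this role. lr and iters are illustrative constants.
    x = x0.copy()
    for _ in range(iters):
        g = grad_f(x) + kappa * (x - y)
        x = x - lr * g
    return x

def catalyst(grad_f, x0, kappa=1.0, outer_iters=20):
    # Catalyst-style outer loop (sketch): solve the regularized
    # subproblem around the extrapolated point y, then extrapolate.
    x = x0.copy()
    y = x0.copy()
    for k in range(1, outer_iters + 1):
        x_new = inner_solve(grad_f, x, y, kappa)
        beta = (k - 1) / (k + 2)  # simplified extrapolation weight
        y = x_new + beta * (x_new - x)
        x = x_new
    return x

# Usage on a simple convex quadratic f(x) = 0.5 * ||x||^2, grad f(x) = x
grad_f = lambda x: x
x_star = catalyst(grad_f, np.array([5.0, -3.0]))
```

On this toy convex quadratic the iterates contract toward the minimizer at the origin; the point of the scheme is that the same wrapper applies unchanged when the objective is non-convex, at the cost of converging only to a stationary point.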
Document type:
Conference paper
AISTATS 2018 - 21st International Conference on Artificial Intelligence and Statistics, Apr 2018, Lanzarote, Spain. pp.1-10

Cited literature: 33 references

https://hal.inria.fr/hal-01773296
Contributor: Julien Mairal
Submitted on: Saturday, April 21, 2018 - 13:42:13
Last modified on: Wednesday, June 20, 2018 - 01:08:35

File

paquette18a.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01773296, version 1

Citation

Courtney Paquette, Hongzhou Lin, Dmitriy Drusvyatskiy, Julien Mairal, Zaid Harchaoui. Catalyst for Gradient-based Nonconvex Optimization. AISTATS 2018 - 21st International Conference on Artificial Intelligence and Statistics, Apr 2018, Lanzarote, Spain. pp.1-10. 〈hal-01773296〉

Metrics

Record views: 244
File downloads: 113