An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration

Abstract: We propose an inexact variable-metric proximal point algorithm to accelerate gradient-based optimization algorithms. The proposed scheme, called QNing, can notably be applied to incremental first-order methods such as the stochastic variance-reduced gradient descent algorithm (SVRG) and other randomized incremental optimization algorithms. QNing is also compatible with composite objectives, meaning that it can return exactly sparse solutions when the objective involves a sparsity-inducing regularization. When combined with limited-memory BFGS rules, QNing is particularly effective for solving high-dimensional optimization problems, while enjoying a worst-case linear convergence rate for strongly convex problems. We present experimental results in which QNing gives significant improvements over competing methods for training machine-learning models on large samples and in high dimensions.
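To make the scheme concrete, here is a toy sketch of an inexact proximal point iteration accelerated by L-BFGS: the proximal subproblem is solved approximately by an inner gradient-descent solver, and the outer loop applies the L-BFGS two-loop recursion to the resulting smoothed (Moreau-Yosida) objective. This is an illustration under simplifying assumptions, not the paper's QNing implementation: the objective is a smooth quadratic least-squares toy (the composite/sparse case would handle the regularizer inside the inner subproblem), the smoothing parameter `kappa`, the fixed inner iteration count, and the unit outer step are all illustrative choices.

```python
import numpy as np

# Toy smooth objective f(w) = 0.5 * ||A w - b||^2 (illustrative only; any
# smooth convex f with a computable gradient would do).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def f_grad(w):
    return A.T @ (A @ w - b)

kappa = 1.0  # smoothing parameter of the proximal term (assumed value)

def approx_prox(x, inner_iters=50):
    """Inexactly minimize f(w) + (kappa/2)||w - x||^2 by gradient descent."""
    L = np.linalg.norm(A, 2) ** 2 + kappa  # Lipschitz constant of the inner gradient
    w = x.copy()
    for _ in range(inner_iters):
        w -= (f_grad(w) + kappa * (w - x)) / L
    return w

def envelope_grad(x):
    """Gradient of the Moreau envelope, kappa * (x - prox(x)), computed inexactly."""
    return kappa * (x - approx_prox(x))

def qn_prox_sketch(x0, outer_iters=30, memory=5):
    """Outer loop: L-BFGS two-loop recursion applied to the smoothed objective."""
    x = x0.copy()
    s_hist, y_hist = [], []
    g = envelope_grad(x)
    for _ in range(outer_iters):
        # Two-loop recursion builds the variable-metric direction r from
        # the stored curvature pairs (s, y), newest first.
        q = g.copy()
        alphas = []
        for s, y in zip(reversed(s_hist), reversed(y_hist)):
            rho = 1.0 / (y @ s)
            a = rho * (s @ q)
            q -= a * y
            alphas.append((a, rho))
        if s_hist:
            gamma = (s_hist[-1] @ y_hist[-1]) / (y_hist[-1] @ y_hist[-1])
        else:
            gamma = 1.0 / kappa  # initial metric scaling
        r = gamma * q
        for (s, y), (a, rho) in zip(zip(s_hist, y_hist), reversed(alphas)):
            beta = rho * (y @ r)
            r += (a - beta) * s
        x_new = x - r  # unit step for simplicity; a safeguarded step is safer
        g_new = envelope_grad(x_new)
        s, y = x_new - x, g_new - g
        if y @ s > 1e-12:  # curvature check: skip degenerate updates
            s_hist.append(s)
            y_hist.append(y)
            if len(s_hist) > memory:
                s_hist.pop(0)
                y_hist.pop(0)
        x, g = x_new, g_new
    return x
```

On this toy problem, `qn_prox_sketch(np.zeros(5))` drives the gradient of f to near zero. The key design point the sketch illustrates is that the quasi-Newton machinery never touches f directly: it only sees the kappa-smooth envelope, whose gradient is available whenever the proximal subproblem can be solved approximately.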
Document type: Preprint / working paper
Contributor: Julien Mairal
Submitted on: Friday, July 20, 2018 - 15:50:30
Last modified on: Saturday, February 2, 2019 - 18:48:42
Long-term archiving on: Sunday, October 21, 2018 - 12:16:19


Files produced by the author(s)


  • HAL Id: hal-01376079, version 3
  • arXiv: 1610.00960


Hongzhou Lin, Julien Mairal, Zaid Harchaoui. An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration. 2018. 〈hal-01376079v3〉


