An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration

Abstract: We propose an inexact variable-metric proximal point algorithm to accelerate gradient-based optimization algorithms. The proposed scheme, called QNing, can notably be applied to incremental first-order methods such as the stochastic variance-reduced gradient descent algorithm (SVRG) and other randomized incremental optimization algorithms. QNing is also compatible with composite objectives, meaning that it can produce exactly sparse solutions when the objective involves a sparsity-inducing regularization. When combined with limited-memory BFGS rules, QNing is particularly effective at solving high-dimensional optimization problems, while enjoying a worst-case linear convergence rate for strongly convex problems. We present experimental results where QNing gives significant improvements over competing methods for training machine learning models on large samples and in high dimensions.
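To illustrate the idea behind the scheme described in the abstract, here is a minimal sketch of an inexact proximal-point outer loop. It smooths the objective through the Moreau envelope F(x) = min_w f(w) + (κ/2)‖w − x‖², whose gradient is κ(x − p(x)) with p(x) the proximal point; the prox is computed inexactly by an inner solver. All function names, step sizes, and the choice of plain gradient descent for both loops are illustrative assumptions, not the paper's algorithm (QNing replaces the outer step with a quasi-Newton, e.g. L-BFGS, update and uses faster inner solvers such as SVRG).

```python
import numpy as np

def inexact_prox_point(f_grad, x0, kappa=1.0, outer_iters=50,
                       inner_iters=100, inner_lr=0.1):
    """Hypothetical sketch: inexact proximal-point acceleration.

    F(x) = min_w f(w) + (kappa/2)||w - x||^2 is the Moreau envelope;
    its gradient is g(x) = kappa * (x - prox(x)). The prox is solved
    approximately by inner gradient descent; the outer step below is a
    plain gradient step on F (a quasi-Newton scheme like QNing would
    instead multiply g by an L-BFGS approximation of the inverse Hessian).
    """
    x = x0.astype(float).copy()
    for _ in range(outer_iters):
        # Inner loop: approximately minimize f(w) + (kappa/2)||w - x||^2.
        w = x.copy()
        for _ in range(inner_iters):
            w -= inner_lr * (f_grad(w) + kappa * (w - x))
        g = kappa * (x - w)        # inexact gradient of the envelope F
        x -= (1.0 / kappa) * g     # outer gradient step on F
    return x
```

For example, with the quadratic f(w) = ½‖w − t‖² (gradient `lambda w: w - t`), the iterates converge to t; the point of the wrapper is that the outer sequence can be accelerated with curvature information even when f itself is only accessed through a first-order inner solver.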
Contributor: Julien Mairal
Submitted on: Monday, January 28, 2019 - 10:28:17 AM
Last modification on: Tuesday, March 19, 2019 - 3:01:34 PM


  • HAL Id: hal-01376079, version 4
  • arXiv: 1610.00960



Hongzhou Lin, Julien Mairal, Zaid Harchaoui. An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration. SIAM Journal on Optimization, Society for Industrial and Applied Mathematics, In press, pp.1-36. ⟨hal-01376079v4⟩


