An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration

Abstract: We propose an inexact variable-metric proximal point algorithm to accelerate gradient-based optimization algorithms. The proposed scheme, called QNing, can notably be applied to incremental first-order methods such as the stochastic variance-reduced gradient descent algorithm (SVRG) and other randomized incremental optimization algorithms. QNing is also compatible with composite objectives, meaning that it can return exactly sparse solutions when the objective involves a sparsity-inducing regularization. When combined with limited-memory BFGS rules, QNing is particularly effective for solving high-dimensional optimization problems, while enjoying a worst-case linear convergence rate for strongly convex problems. We present experimental results where QNing gives significant improvements over competing methods for training machine learning models on large samples and in high dimensions.
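The core idea described in the abstract can be sketched in a few lines: smooth the objective through its Moreau envelope F(x) = min_w f(w) + (κ/2)||w − x||², evaluate that envelope and its gradient inexactly via an inner first-order solver, and run a quasi-Newton (L-BFGS) outer loop on the smoothed function. The sketch below is a minimal, generic illustration under stated assumptions — it uses a synthetic quadratic objective, plain gradient descent as the inner solver (a stand-in for SVRG), and a standard L-BFGS two-loop recursion with Armijo backtracking; it is not the paper's exact QNing rule, and all names (`inexact_prox`, `smoothed_oracle`, `qning_lbfgs`) and parameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
# Synthetic strongly convex quadratic f(w) = 0.5 w'Aw - b'w, an illustrative
# stand-in for the (sum-structured) empirical risk that QNing would smooth.
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)                 # positive definite Hessian
b = rng.standard_normal(n)
L_f = np.linalg.eigvalsh(A).max()       # Lipschitz constant of grad f

def f(w):
    return 0.5 * w @ A @ w - b @ w

def grad_f(w):
    return A @ w - b

kappa = 1.0  # smoothing parameter of the Moreau envelope (illustrative choice)

def inexact_prox(x, iters=50):
    """Approximate p(x) = argmin_w f(w) + (kappa/2)||w - x||^2 with plain
    gradient descent, warm-started at x (stand-in for SVRG or any inner solver)."""
    w = x.copy()
    step = 1.0 / (L_f + kappa)
    for _ in range(iters):
        w -= step * (grad_f(w) + kappa * (w - x))
    return w

def smoothed_oracle(x):
    """Inexact value and gradient of the Moreau envelope:
    F(x) = f(p(x)) + (kappa/2)||p(x) - x||^2,  grad F(x) = kappa (x - p(x))."""
    p = inexact_prox(x)
    return f(p) + 0.5 * kappa * np.sum((p - x) ** 2), kappa * (x - p)

def qning_lbfgs(x0, mem=5, iters=30):
    """Outer loop: L-BFGS two-loop recursion on the smoothed objective with a
    backtracking Armijo line search (a generic sketch, not the paper's rule)."""
    x = x0.copy()
    S, Y = [], []                       # curvature pairs (s_k, y_k)
    Fx, g = smoothed_oracle(x)
    for _ in range(iters):
        # Two-loop recursion: apply the implicit inverse-Hessian estimate to g.
        q = g.copy()
        alphas = []
        for s, y in zip(reversed(S), reversed(Y)):
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        if S:                           # initial scaling gamma = s'y / y'y
            q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])
        for (s, y), a in zip(zip(S, Y), reversed(alphas)):
            beta = (y @ q) / (y @ s)
            q += (a - beta) * s
        d = -q                          # quasi-Newton descent direction
        # Backtracking line search on the (inexactly evaluated) envelope.
        t = 1.0
        while True:
            x_new = x + t * d
            F_new, g_new = smoothed_oracle(x_new)
            if F_new <= Fx + 1e-4 * t * (g @ d) or t < 1e-8:
                break
            t *= 0.5
        s_vec, y_vec = x_new - x, g_new - g
        if s_vec @ y_vec > 1e-12:       # keep pair only if curvature is positive
            S.append(s_vec)
            Y.append(y_vec)
            if len(S) > mem:
                S.pop(0)
                Y.pop(0)
        x, Fx, g = x_new, F_new, g_new
    return x

x_star = np.linalg.solve(A, b)          # exact minimizer, for reference
x_hat = qning_lbfgs(np.zeros(n))
print(np.linalg.norm(x_hat - x_star))
```

Because the inner solver is warm-started at the current iterate, its error scales with ||x − p(x)||, which itself shrinks as the outer loop converges — this is the intuition behind the worst-case linear rate claimed for strongly convex problems. For a composite objective with a sparsity-inducing regularizer, the inner gradient step would be replaced by a proximal gradient step, so the returned point inherits exact sparsity.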
Document type :
Preprints, Working Papers, ...
Contributor: Julien Mairal
Submitted on: Friday, July 20, 2018 - 3:50:30 PM
Last modification on: Friday, July 17, 2020 - 11:40:04 AM
Document(s) archived on: Sunday, October 21, 2018 - 12:16:19 PM


Files produced by the author(s)


  • HAL Id: hal-01376079, version 3
  • arXiv: 1610.00960


Hongzhou Lin, Julien Mairal, Zaid Harchaoui. An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration. 2018. ⟨hal-01376079v3⟩


