Journal article

An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration

Abstract: We propose an inexact variable-metric proximal point algorithm to accelerate gradient-based optimization algorithms. The proposed scheme, called QNing, can notably be applied to incremental first-order methods such as the stochastic variance-reduced gradient descent algorithm (SVRG) and other randomized incremental optimization algorithms. QNing is also compatible with composite objectives, meaning that it is able to produce exactly sparse solutions when the objective involves a sparsity-inducing regularization. When combined with limited-memory BFGS rules, QNing is particularly effective for solving high-dimensional optimization problems, while enjoying a worst-case linear convergence rate for strongly convex problems. We present experimental results where QNing gives significant improvements over competing methods for training machine learning models on large samples and in high dimensions.
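The abstract's outer scheme can be illustrated with a minimal sketch. Everything below is an assumption-laden illustration, not the paper's implementation: the function name `qning_sketch` and all parameter names are hypothetical, the inner subproblem is solved with plain gradient descent rather than an incremental method like SVRG, and no line search or restart heuristics are included. The sketch smooths the objective through the Moreau-Yosida envelope F(x) = min_z f(z) + (κ/2)||z − x||², whose gradient is κ(x − p(x)) with p(x) the (inexactly computed) proximal point, and applies L-BFGS two-loop updates to those smoothed gradients.

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: apply the implicit inverse-Hessian estimate to g."""
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    if s_list:  # scale by the standard initial Hessian approximation
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / np.dot(y, s)
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return q

def qning_sketch(grad_f, x0, kappa=1.0, memory=5,
                 outer_iters=50, inner_iters=20, inner_lr=0.1):
    """Hypothetical QNing-style outer loop (illustration only).

    Approximately minimizes a smooth f by running L-BFGS on the
    Moreau-Yosida envelope F(x) = min_z f(z) + (kappa/2)||z - x||^2.
    """
    x = x0.copy()
    s_list, y_list = [], []
    x_prev, g_prev = None, None
    for _ in range(outer_iters):
        # Inexact proximal step: gradient descent on f(z) + kappa/2 ||z - x||^2.
        z = x.copy()
        for _ in range(inner_iters):
            z -= inner_lr * (grad_f(z) + kappa * (z - x))
        g = kappa * (x - z)  # gradient of the envelope at x (up to inexactness)
        if g_prev is not None:
            s, y = x - x_prev, g - g_prev
            if np.dot(s, y) > 1e-12:  # keep the pair only if curvature is positive
                s_list.append(s)
                y_list.append(y)
                if len(s_list) > memory:
                    s_list.pop(0)
                    y_list.pop(0)
        x_prev, g_prev = x.copy(), g.copy()
        x = x - lbfgs_direction(g, s_list, y_list)
    return x
```

On a well-conditioned quadratic the first outer step reduces to a gradient step on the envelope (the L-BFGS memory is empty), and subsequent steps use the accumulated curvature pairs; the envelope's gradient Lipschitz constant is at most κ, which is why a unit step is stable here without a line search.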
Contributor: Julien Mairal
Submitted on: Monday, January 28, 2019 - 10:28:17 AM
Last modification on: Thursday, January 20, 2022 - 5:28:14 PM
Long-term archiving on: Monday, April 29, 2019 - 2:23:59 PM
Hongzhou Lin, Julien Mairal, Zaid Harchaoui. An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration. SIAM Journal on Optimization, Society for Industrial and Applied Mathematics, 2019, 29(2), pp. 1408-1443. doi:10.1137/17M1125157. hal-01376079v4.