An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration

Abstract: We propose an inexact variable-metric proximal point algorithm to accelerate gradient-based optimization algorithms. The proposed scheme, called QNing, can notably be applied to incremental first-order methods such as the stochastic variance-reduced gradient descent algorithm (SVRG) and other randomized incremental optimization algorithms. QNing is also compatible with composite objectives, meaning that it can provide exactly sparse solutions when the objective involves a sparsity-inducing regularization. When combined with limited-memory BFGS rules, QNing is particularly effective for solving high-dimensional optimization problems, while enjoying a worst-case linear convergence rate for strongly convex problems. We present experimental results in which QNing gives significant improvements over competing methods for training machine learning models on large samples and in high dimensions.
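
As an illustration of the mechanism described above, here is a minimal Python sketch, written for this summary and not taken from the authors' code, of an inexact variable-metric proximal point loop in the spirit of QNing: L-BFGS-style updates are applied to the Moreau-Yosida smoothing F(x) = min_z f(z) + (kappa/2)||z - x||^2, whose gradient kappa*(x - p(x)) only requires an approximate proximal point p(x) computed by an inner first-order solver (SVRG or another incremental method in the paper; plain gradient descent here for brevity). The function names, step sizes, and the restriction to a smooth objective are simplifying assumptions; composite objectives and the paper's theoretical safeguards are not reproduced.

import numpy as np


def inner_prox(f_grad, x_anchor, kappa, z0, n_steps=50, lr=0.01):
    # Approximately minimize f(z) + (kappa/2)||z - x_anchor||^2 by plain
    # gradient descent; in the paper this role is played by SVRG or another
    # randomized incremental method.  The step size is an arbitrary choice.
    z = z0.copy()
    for _ in range(n_steps):
        z -= lr * (f_grad(z) + kappa * (z - x_anchor))
    return z


def lbfgs_direction(g, s_list, y_list):
    # Standard L-BFGS two-loop recursion on the stored (s, y) pairs;
    # returns an approximation of H*g, where H approximates the inverse
    # Hessian of the smoothed objective.
    q = g.copy()
    history = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / y.dot(s)
        alpha = rho * s.dot(q)
        history.append((alpha, rho, s, y))
        q -= alpha * y
    if y_list:
        s, y = s_list[-1], y_list[-1]
        q *= s.dot(y) / y.dot(y)  # usual initial scaling of the metric
    for alpha, rho, s, y in reversed(history):
        beta = rho * y.dot(q)
        q += (alpha - beta) * s
    return q


def qning_sketch(f, f_grad, x0, kappa=1.0, n_outer=30, memory=10):
    # Outer loop: quasi-Newton steps on the Moreau-Yosida envelope
    # F(x) = min_z f(z) + (kappa/2)||z - x||^2, with inexact gradients
    # g = kappa * (x - p(x)) obtained from the inner solver.
    x = x0.copy()
    s_list, y_list = [], []
    p = inner_prox(f_grad, x, kappa, x)
    g = kappa * (x - p)
    F = f(p) + 0.5 * kappa * np.dot(p - x, p - x)  # inexact envelope value
    for _ in range(n_outer):
        x_trial = x - lbfgs_direction(g, s_list, y_list)
        p_trial = inner_prox(f_grad, x_trial, kappa, p)
        F_trial = f(p_trial) + 0.5 * kappa * np.dot(p_trial - x_trial, p_trial - x_trial)
        if F_trial > F:
            # Safeguard: fall back to a plain (inexact) proximal point step.
            x_trial = p
            p_trial = inner_prox(f_grad, x_trial, kappa, p)
            F_trial = f(p_trial) + 0.5 * kappa * np.dot(p_trial - x_trial, p_trial - x_trial)
        g_trial = kappa * (x_trial - p_trial)
        s, y = x_trial - x, g_trial - g
        if y.dot(s) > 1e-12:  # keep only curvature pairs with positive s'y
            s_list.append(s)
            y_list.append(y)
            if len(s_list) > memory:
                s_list.pop(0)
                y_list.pop(0)
        x, p, g, F = x_trial, p_trial, g_trial, F_trial
    return p  # approximate minimizer of f

The function-value test that falls back to a plain (inexact) proximal point step whenever the quasi-Newton trial does not decrease the smoothed objective mimics the kind of safeguard that keeps such an outer loop no worse than the underlying proximal point iteration; it is a heuristic rendering of the idea, not the paper's exact acceptance rule.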

https://hal.inria.fr/hal-01376079
Contributor: Julien Mairal
Submitted on: Monday, January 28, 2019 - 10:28:17 AM
Last modification on: Tuesday, June 18, 2019 - 12:03:15 PM

Files

quickening_arxiv.pdf (file produced by the author(s))

Citation

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration. SIAM Journal on Optimization, Society for Industrial and Applied Mathematics, 2019, 29 (2), pp.1408-1443. ⟨10.1137/17M1125157⟩. ⟨hal-01376079v4⟩

Metrics

Record views: 152
File downloads: 583