
A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization

Abstract: We propose a generic approach to accelerating gradient-based optimization algorithms with quasi-Newton principles. The proposed scheme, called QuickeNing, can be applied to incremental first-order methods such as stochastic variance-reduced gradient (SVRG) or incremental surrogate optimization (MISO). It is also compatible with composite objectives, meaning that it can produce exactly sparse solutions when the objective involves a sparsity-inducing regularization. QuickeNing relies on limited-memory BFGS rules, making it appropriate for solving high-dimensional optimization problems. Moreover, it enjoys a worst-case linear convergence rate for strongly convex problems. We present experimental results in which QuickeNing gives significant improvements over competing methods for solving large-scale, high-dimensional machine learning problems.
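To make the mechanism concrete, below is a minimal Python sketch of the general idea the abstract describes: an outer limited-memory BFGS (L-BFGS) loop driven by an inner first-order solver through a smoothed (Moreau-envelope) view of the objective, whose gradient at x is kappa * (x - z*(x)), where z*(x) approximately minimizes f(z) + (kappa/2) * ||z - x||^2. This is an illustrative reconstruction, not the paper's QuickeNing algorithm: the names `inner_solver`, `kappa`, and `memory`, and the unit step size, are assumptions made for illustration, and the actual method additionally controls the inexactness of the inner solves and includes safeguards such as a sufficient-decrease test.

```python
import numpy as np

def envelope_grad(inner_solver, x, kappa):
    # Gradient of the Moreau envelope F(x) = min_z f(z) + (kappa/2)*||z - x||^2.
    # `inner_solver` stands for any routine returning an (approximate) minimizer
    # z*(x); in the paper's setting this role would be played by an incremental
    # first-order method such as SVRG or MISO.
    z = inner_solver(x)
    return kappa * (x - z)

def two_loop(grad, s_hist, y_hist):
    # Standard L-BFGS two-loop recursion: computes H @ grad, where H is the
    # implicit inverse-Hessian approximation built from curvature pairs (s, y).
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):
        a = s.dot(q) / y.dot(s)
        alphas.append(a)
        q -= a * y
    if s_hist:  # initial scaling gamma = s'y / y'y from the newest pair
        s, y = s_hist[-1], y_hist[-1]
        q *= s.dot(y) / y.dot(y)
    for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
        b = y.dot(q) / y.dot(s)
        q += (a - b) * s
    return q

def lbfgs_on_envelope(inner_solver, x0, kappa=1.0, n_iters=100, memory=10):
    # Outer quasi-Newton loop on the smoothed objective. A unit step is taken
    # for simplicity; a practical implementation would add a line search.
    x = x0.copy()
    g = envelope_grad(inner_solver, x, kappa)
    s_hist, y_hist = [], []
    for _ in range(n_iters):
        x_new = x - two_loop(g, s_hist, y_hist)
        g_new = envelope_grad(inner_solver, x_new, kappa)
        s, y = x_new - x, g_new - g
        if y.dot(s) > 1e-10:  # keep only pairs satisfying the curvature condition
            s_hist.append(s)
            y_hist.append(y)
            if len(s_hist) > memory:
                s_hist.pop(0)
                y_hist.pop(0)
        x, g = x_new, g_new
    return x
```

As a quick sanity check, for f(z) = 0.5*||A z - b||^2 the inner problem has a closed form, so `inner_solver = lambda x: np.linalg.solve(A.T @ A + kappa * np.eye(d), A.T @ b + kappa * x)` plays the role of an exact proximal step.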
Document type:
Preprints, Working Papers, ...
Complete list of metadata

Cited literature [47 references]

https://hal.archives-ouvertes.fr/hal-01376079
Contributor: Julien Mairal
Submitted on: Wednesday, April 5, 2017 - 1:47:15 PM
Last modification on: Friday, July 17, 2020 - 11:40:04 AM
Document(s) archived on: Thursday, July 6, 2017 - 1:15:50 PM

Files

quickening_arxiv.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01376079, version 2
  • arXiv: 1610.00960

Citation

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization. 2017. ⟨hal-01376079v2⟩

Metrics

Record views: 806
File downloads: 625