
QuickeNing: A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization

Abstract: We propose an approach to accelerate gradient-based optimization algorithms by giving them the ability to exploit curvature information using quasi-Newton update rules. The proposed scheme, called QuickeNing, is generic and can be applied to a large class of first-order methods, such as incremental and block-coordinate algorithms; it is also compatible with composite objectives, meaning that it can produce exactly sparse solutions when the objective involves a sparsity-inducing regularization. QuickeNing relies on limited-memory BFGS rules, making it appropriate for solving high-dimensional optimization problems; since it requires no line search, it is also simple to use and to implement. Moreover, it enjoys a worst-case linear convergence rate for strongly convex problems. We present experimental results in which QuickeNing gives significant improvements over competing methods for solving large-scale, high-dimensional machine learning problems.
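The limited-memory BFGS rules mentioned in the abstract are typically implemented with the classical two-loop recursion, which applies an inverse-Hessian approximation to a gradient using only a few stored vector pairs. A minimal NumPy sketch of that recursion (an illustration of standard L-BFGS machinery, not of the QuickeNing scheme itself, whose details are in the paper) might look like:

```python
import numpy as np

def lbfgs_direction(grad, s_pairs, y_pairs):
    """Two-loop recursion: approximate H_k @ grad, where H_k is the L-BFGS
    inverse-Hessian estimate built from the stored pairs
    s_i = x_{i+1} - x_i and y_i = grad_{i+1} - grad_i (oldest first)."""
    q = grad.astype(float).copy()
    alphas = []
    # First loop: walk the memory from the newest pair to the oldest.
    for s, y in zip(reversed(s_pairs), reversed(y_pairs)):
        rho = 1.0 / y.dot(s)      # requires the curvature condition s·y > 0
        a = rho * s.dot(q)
        q -= a * y
        alphas.append(a)
    # Scale by gamma = s·y / y·y from the most recent pair (H_0 = gamma * I).
    if s_pairs:
        s, y = s_pairs[-1], y_pairs[-1]
        q *= s.dot(y) / y.dot(y)
    # Second loop: walk the memory from the oldest pair to the newest.
    for s, y, a in zip(s_pairs, y_pairs, reversed(alphas)):
        rho = 1.0 / y.dot(s)
        b = rho * y.dot(q)
        q += (a - b) * s
    return q  # a descent step is x_new = x - q

# With an empty memory, the recursion falls back to plain gradient descent:
g = np.array([3.0, -1.0])
assert np.allclose(lbfgs_direction(g, [], []), g)
```

Because only the last few (s, y) pairs are stored, the cost per call is linear in the dimension, which is what makes limited-memory rules suitable for the high-dimensional problems the abstract targets. How such rules are combined with generic first-order methods and composite objectives is the paper's actual contribution.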
Document type: Preprints, Working Papers, ...

Cited literature: 44 references
Contributor: Julien Mairal
Submitted on: Tuesday, October 4, 2016 - 11:10:01 AM
Last modification on: Friday, July 17, 2020 - 11:40:04 AM
Document(s) archived on: Friday, February 3, 2017 - 3:30:59 PM


Files produced by the author(s)


  • HAL Id: hal-01376079, version 1
  • arXiv: 1610.00960


Hongzhou Lin, Julien Mairal, Zaid Harchaoui. QuickeNing: A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization. 2016. ⟨hal-01376079v1⟩


