A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization

Abstract: We propose a generic approach to accelerating gradient-based optimization algorithms with quasi-Newton principles. The proposed scheme, called QuickeNing, can be applied to incremental first-order methods such as stochastic variance-reduced gradient (SVRG) or incremental surrogate optimization (MISO). It is also compatible with composite objectives, meaning that it can produce exactly sparse solutions when the objective involves a sparsity-inducing regularization. QuickeNing relies on limited-memory BFGS rules, making it well suited to high-dimensional optimization problems. Moreover, it enjoys a worst-case linear convergence rate for strongly convex problems. We present experimental results in which QuickeNing gives significant improvements over competing methods on large-scale, high-dimensional machine learning problems.
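The abstract states that QuickeNing relies on limited-memory BFGS rules. For readers unfamiliar with these rules, the following is a minimal sketch of the standard L-BFGS two-loop recursion that such methods use to compute a search direction from a short history of iterate and gradient differences. This is a generic textbook recursion, not the paper's exact variant; the function name `lbfgs_direction` and the NumPy-based interface are illustrative assumptions.

```python
import numpy as np

def lbfgs_direction(grad, s_hist, y_hist):
    """Compute an L-BFGS descent direction -H @ grad via the standard
    two-loop recursion, given histories (oldest first, at most m pairs)
    of parameter differences s_i = x_{i+1} - x_i and gradient
    differences y_i = g_{i+1} - g_i."""
    if not s_hist:  # no curvature information yet: fall back to steepest descent
        return -grad
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    # First loop: walk the history from newest to oldest.
    for s, y, rho in zip(reversed(s_hist), reversed(y_hist), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Scale by gamma = s'y / y'y, a common initial inverse-Hessian heuristic.
    s_last, y_last = s_hist[-1], y_hist[-1]
    q *= np.dot(s_last, y_last) / np.dot(y_last, y_last)
    # Second loop: walk the history from oldest to newest.
    for s, y, rho, alpha in zip(s_hist, y_hist, rhos, reversed(alphas)):
        beta = rho * np.dot(y, q)
        q += (alpha - beta) * s
    return -q  # descent direction
```

Because only the last m pairs (s_i, y_i) are stored, the cost per direction is O(m d) time and memory in dimension d, which is what makes the limited-memory variant practical for the high-dimensional problems the abstract targets.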
Document type:
Preprint, working paper
2017


https://hal.inria.fr/hal-01376079
Contributor: Julien Mairal
Submitted on: Wednesday, April 5, 2017 - 13:47:15
Last modified on: Thursday, June 15, 2017 - 09:09:24

Files

quickening_arxiv.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01376079, version 2
  • ARXIV : 1610.00960

Citation

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization. 2017. <hal-01376079v2>

Metrics

Record views: 386
Document downloads: 223