A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization - Inria (Institut national de recherche en sciences et technologies du numérique)
Preprint / Working Paper. Year: 2017

A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization

Abstract

We propose a generic approach to accelerating gradient-based optimization algorithms with quasi-Newton principles. The proposed scheme, called QuickeNing, can be applied to incremental first-order methods such as stochastic variance-reduced gradient (SVRG) or incremental surrogate optimization (MISO). It is also compatible with composite objectives, meaning that it can produce exactly sparse solutions when the objective involves a sparsity-inducing regularization. QuickeNing relies on limited-memory BFGS rules, making it appropriate for solving high-dimensional optimization problems. Moreover, it enjoys a worst-case linear convergence rate for strongly convex problems. We present experimental results where QuickeNing gives significant improvements over competing methods for solving large-scale, high-dimensional machine learning problems.
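To make the two ingredients named in the abstract concrete, here is a minimal, illustrative sketch (not the authors' QuickeNing implementation) of the standard limited-memory BFGS two-loop recursion and of the ℓ1 proximal operator that yields exactly sparse solutions for composite objectives; all function names are hypothetical.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Approximate H^{-1} @ grad via the classical L-BFGS two-loop recursion,
    using stored curvature pairs s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k."""
    q = grad.astype(float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: walk the memory from newest pair to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Scale by gamma = s'y / y'y, a standard initial inverse-Hessian guess.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    # Second loop: walk the memory from oldest pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, q)
        q += (alpha - beta) * s
    return q  # quasi-Newton direction; memory cost is O(m * d), not O(d^2)

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1: shrinks coordinates toward zero
    and sets small ones exactly to zero, producing sparse iterates."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
```

For example, on the quadratic f(x) = x1^2 + 5*x2^2 with the two curvature pairs s1 = (1, 0), s2 = (0, 1) and y_k = A s_k for A = diag(2, 10), the recursion applied to the gradient (2, 10) returns (1, 1), i.e., exactly A^{-1} times the gradient, which is why such directions accelerate plain gradient steps.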
Main file: quickening_arxiv.pdf (486.68 KB). Origin: files produced by the author(s).

Dates and versions

hal-01376079 , version 1 (04-10-2016)
hal-01376079 , version 2 (05-04-2017)
hal-01376079 , version 3 (20-07-2018)
hal-01376079 , version 4 (28-01-2019)


Cite

Hongzhou Lin, Julien Mairal, Zaid Harchaoui. A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization. 2017. ⟨hal-01376079v2⟩
1108 views
1357 downloads
