Variance Reduced Stochastic Gradient Descent with Neighbors

Thomas Hofmann 1, Aurelien Lucchi 1, Simon Lacoste-Julien 2, 3, 4, Brian McWilliams 1
Affiliation 2: SIERRA - Statistical Machine Learning and Parsimony, DI-ENS - Département d'informatique de l'École normale supérieure (ENS Paris), Inria Paris-Rocquencourt, CNRS UMR 8548
Abstract: Stochastic Gradient Descent (SGD) is a workhorse in machine learning, yet its slow convergence can be a computational bottleneck. Variance reduction techniques such as SAG, SVRG, and SAGA have been proposed to overcome this weakness, achieving linear convergence. However, these methods rely either on computing full gradients at pivot points or on keeping per-data-point corrections in memory, so their speed-ups relative to SGD may need a minimal number of epochs to materialize. This paper investigates algorithms that can exploit neighborhood structure in the training data to share and re-use information about past stochastic gradients across data points, which offers advantages in the transient optimization phase. As a by-product, we provide a unified convergence analysis for a family of variance reduction algorithms, which we call memorization algorithms. We provide experimental results supporting our theory.
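To make the mechanism concrete, here is a minimal Python sketch of a SAGA-style memorization update with a naive neighbor-sharing step: the fresh stochastic gradient of the sampled point also refreshes the stored gradient memory of its neighbors, whose gradients are assumed similar. The least-squares loss, the function name saga_with_neighbors, and the neighbors structure are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def saga_with_neighbors(X, y, neighbors, step=0.01, epochs=5, seed=0):
        """Variance-reduced SGD on f_i(w) = 0.5*(x_i.w - y_i)^2 with gradient memory.

        Illustrative sketch only: the loss and the neighbor-sharing heuristic
        are assumptions, not the paper's exact method. Assumes each point i
        is contained in neighbors[i].
        """
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        alpha = np.zeros((n, d))        # per-point gradient memory ("memorization")
        alpha_bar = alpha.mean(axis=0)  # running mean of the memory
        for _ in range(epochs * n):
            i = int(rng.integers(n))
            g = (X[i] @ w - y[i]) * X[i]            # stochastic gradient at point i
            w -= step * (g - alpha[i] + alpha_bar)  # SAGA-style corrected step
            # Neighbor sharing: re-use g to refresh the memory of nearby
            # points, keeping their corrections less stale between visits.
            for j in neighbors[i]:
                alpha_bar = alpha_bar + (g - alpha[j]) / n
                alpha[j] = g
        return w

    # With each point as its own sole neighbor this reduces to plain SAGA:
    # w = saga_with_neighbors(X, y, neighbors=[[i] for i in range(len(y))])

Sharing across larger neighborhoods keeps the memory fresher between visits at the cost of a bias from re-used gradients; that trade-off is what the abstract's claimed advantage in the transient optimization phase refers to.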
Document type:
Conference paper
NIPS 2015 - Advances in Neural Information Processing Systems 28, Dec 2015, Montreal, Canada. 〈https://papers.nips.cc/book/advances-in-neural-information-processing-systems-28-2015〉

https://hal.inria.fr/hal-01248672
Contributor: Simon Lacoste-Julien
Submitted on: Monday, December 28, 2015 - 04:30:01
Last modified on: Saturday, October 20, 2018 - 15:06:01

Identifiers

  • HAL Id: hal-01248672, version 1
  • arXiv: 1506.03662

Citation

Thomas Hofmann, Aurelien Lucchi, Simon Lacoste-Julien, Brian McWilliams. Variance Reduced Stochastic Gradient Descent with Neighbors. NIPS 2015 - Advances in Neural Information Processing Systems 28, Dec 2015, Montreal, Canada. 〈https://papers.nips.cc/book/advances-in-neural-information-processing-systems-28-2015〉. 〈hal-01248672〉

Metrics

Record views: 347