Preprint, Working Paper. Year: 2012

A Stochastic Gradient Method with an Exponential Convergence Rate for Strongly-Convex Optimization with Finite Training Sets

Abstract

We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex. While standard stochastic gradient methods converge at sublinear rates for this problem, the proposed method incorporates a memory of previous gradient values in order to achieve a linear convergence rate. Numerical experiments indicate that the new algorithm can dramatically outperform standard algorithms.
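The "memory of previous gradient values" idea described in the abstract can be sketched as follows: a minimal SAG-style loop on a toy least-squares problem. The problem instance, variable names, step size, and iteration count below are our illustrative assumptions, not code from the paper.

```python
import numpy as np

# Illustrative sketch (our own toy setup, not the paper's code):
# minimize (1/n) * sum_i (a_i^T x - b_i)^2, a strongly convex finite sum.
rng = np.random.default_rng(0)
n, p = 50, 5
A = rng.standard_normal((n, p))
x_true = rng.standard_normal(p)
b = A @ x_true  # the minimizer of the sum is x_true

def grad_i(x, i):
    # Gradient of the i-th term f_i(x) = (a_i^T x - b_i)^2
    return 2.0 * (A[i] @ x - b[i]) * A[i]

L_max = 2.0 * max(np.sum(A**2, axis=1))  # Lipschitz bound for the f_i
alpha = 1.0 / (16.0 * L_max)             # conservative constant step size

x = np.zeros(p)
y = np.zeros((n, p))  # memory: the last gradient evaluated for each f_i
d = np.zeros(p)       # running sum of the n stored gradients

for _ in range(20000):
    i = rng.integers(n)
    g = grad_i(x, i)
    d += g - y[i]         # swap the stale stored gradient of f_i for a fresh one
    y[i] = g
    x -= (alpha / n) * d  # step along the average of the stored gradients

print(np.linalg.norm(x - x_true))
```

Each iteration touches a single randomly chosen gradient, like plain stochastic gradient descent, but the step uses the average of all n stored gradients, which is what allows a constant step size and a linear convergence rate on strongly convex finite sums.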
Main file
sag_arxiv.pdf (511.06 KB) Download the file
Origin: Files produced by the author(s)

Dates and versions

hal-00674995 , version 1 (28-02-2012)
hal-00674995 , version 2 (05-07-2012)
hal-00674995 , version 3 (06-07-2012)
hal-00674995 , version 4 (11-03-2013)

Identifiers

Cite

Nicolas Le Roux, Mark Schmidt, Francis Bach. A Stochastic Gradient Method with an Exponential Convergence Rate for Strongly-Convex Optimization with Finite Training Sets. 2012. ⟨hal-00674995v1⟩
655 Views
5195 Downloads

