Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition
Preprint / Working Paper, 2013

Abstract

We consider optimizing a smooth convex function $f$ that is the average of a set of differentiable functions $f_i$, under the assumption considered by Solodov [1998] and Tseng [1998] that the norm of each gradient $f_i'$ is bounded by a linear function of the norm of the average gradient $f'$. We show that under these assumptions the basic stochastic gradient method with a sufficiently small constant step-size has an $O(1/k)$ convergence rate, and has a linear convergence rate if $f$ is strongly convex.
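As an illustrative sketch (not taken from the paper), the setting above can be reproduced on a consistent least-squares problem: with $f_i(x) = \tfrac12 (a_i^\top x - b_i)^2$ and $b = A x^\star$, every $f_i'$ vanishes at $x^\star$, so the strong growth condition holds and constant step-size SGD converges linearly. All names and constants below are assumptions chosen for the demo, not values from the paper.

```python
import numpy as np

# Hypothetical demo: constant step-size SGD on f(x) = (1/n) * sum_i f_i(x)
# with f_i(x) = 0.5 * (a_i^T x - b_i)^2. The system is consistent
# (b = A @ x_star), so each individual gradient vanishes at the solution
# and the strong growth condition of Solodov/Tseng is satisfied.
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star  # consistent labels: all f_i are minimized at x_star

L_max = max(np.sum(A * A, axis=1))  # largest per-example smoothness constant
step = 1.0 / L_max                  # a sufficiently small constant step-size

x = np.zeros(d)
for k in range(5000):
    i = rng.integers(n)                 # sample one example uniformly
    grad_i = (A[i] @ x - b[i]) * A[i]   # gradient of the sampled f_i
    x = x - step * grad_i               # plain SGD update, no averaging

print(np.linalg.norm(x - x_star))  # distance to the solution (small)
```

Because $n > d$ the random design makes $f$ strongly convex with high probability, so the iterates contract toward $x^\star$ at a linear rate rather than stalling at a noise floor, matching the strongly convex case of the abstract.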
Main file: smallResidual.pdf (79.99 KB). Origin: files produced by the author(s).

Dates and versions

hal-00855113, version 1 (28-08-2013)

Identifiers

Cite

Mark Schmidt, Nicolas Le Roux. Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition. 2013. ⟨hal-00855113⟩