
Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition

Mark Schmidt 1, 2, Nicolas Le Roux 1, 2
1 SIERRA - Statistical Machine Learning and Parsimony
2 DI-ENS - Département d'informatique - ENS Paris, Inria Paris-Rocquencourt, CNRS UMR 8548
Abstract : We consider optimizing a smooth convex function $f$ that is the average of a set of differentiable functions $f_i$, under the assumption, considered by Solodov [1998] and Tseng [1998], that the norm of each gradient $f_i'$ is bounded by a linear function of the norm of the average gradient $f'$. We show that under this assumption the basic stochastic gradient method with a sufficiently small constant step size has an $O(1/k)$ convergence rate, and has a linear convergence rate if $f$ is strongly convex.
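A minimal sketch (not from the paper, with illustrative choices throughout): a least-squares objective whose data are consistent, so that some $x^*$ fits every example exactly, satisfies a growth condition of the form $\|f_i'(x)\| \leq B \|f'(x)\|$ whenever the Hessian is positive definite, and constant step-size SGD then converges linearly, as the abstract describes for strongly convex $f$. The step size $1/L_{\max}$ below is one "sufficiently small" choice, not the paper's prescription.

import numpy as np

# Interpolating least-squares: f(x) = (1/n) * sum_i f_i(x), with
# f_i(x) = 0.5 * (a_i^T x - b_i)^2 and b generated so that x_star
# fits every example exactly (zero residual at the optimum).
rng = np.random.default_rng(0)
n, d = 100, 10
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star

# Largest per-example smoothness constant, L_i = ||a_i||^2; a constant
# step of 1/L_max is an illustrative "sufficiently small" choice.
L_max = np.max(np.sum(A ** 2, axis=1))
step = 1.0 / L_max

x = np.zeros(d)
for k in range(2000):
    i = rng.integers(n)                  # sample one component uniformly
    grad_i = (A[i] @ x - b[i]) * A[i]    # gradient of f_i at x
    x -= step * grad_i                   # constant step-size SGD update

print("distance to minimizer:", np.linalg.norm(x - x_star))

On this well-conditioned random instance the distance to the minimizer shrinks geometrically, consistent with the linear rate claimed for the strongly convex case.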
Document type: Preprints, Working Papers, ...

Cited literature: 7 references

https://hal.inria.fr/hal-00855113
Contributor: Mark Schmidt
Submitted on: Wednesday, August 28, 2013 - 9:20:24 PM
Last modification on: Thursday, March 17, 2022 - 10:08:44 AM
Long-term archiving on: Monday, December 2, 2013 - 8:52:11 AM

Files

smallResidual.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-00855113, version 1
  • arXiv : 1308.6370

Citation

Mark Schmidt, Nicolas Le Roux. Fast Convergence of Stochastic Gradient Descent under a Strong Growth Condition. 2013. ⟨hal-00855113⟩

Metrics

Record views: 206
File downloads: 1166