On the Convergence of Single-Call Stochastic Extra-Gradient Methods

Abstract: Owing to their stability and convergence speed, extragradient methods have become a staple for solving large-scale saddle-point problems in machine learning. The basic premise of these algorithms is the use of an extrapolation step before performing an update; thanks to this exploration step, extragradient methods overcome many of the non-convergence issues that plague gradient descent/ascent schemes. On the other hand, as we show in this paper, running vanilla extragradient with stochastic gradients may jeopardize its convergence, even in simple bilinear models. To overcome this failure, we investigate a double stepsize extragradient algorithm where the exploration step evolves at a more aggressive timescale than the update step. We show that this modification allows the method to converge even with stochastic gradients, and we derive sharp convergence rates under an error bound condition.
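To make the two-timescale idea from the abstract concrete, below is a minimal numerical sketch. It is not the authors' code and, for simplicity, uses a plain two-call extra-gradient step rather than the single-call variants analysed in the paper: an aggressive exploration stepsize and a more conservative update stepsize are applied to a toy bilinear saddle point min_x max_y x^T A y. The stepsize schedules, noise model, and problem dimensions are illustrative assumptions.

import numpy as np

# Illustrative sketch only: double-stepsize stochastic extra-gradient on a
# toy bilinear saddle point  min_x max_y  x^T A y.  The stepsize schedules
# and the Gaussian noise model are assumptions made for this example.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))

def noisy_grad(x, y, sigma=0.1):
    # Stochastic oracle: exact bilinear gradients perturbed by Gaussian noise.
    gx = A @ y + sigma * rng.standard_normal(d)    # gradient w.r.t. x (minimised)
    gy = A.T @ x + sigma * rng.standard_normal(d)  # gradient w.r.t. y (maximised)
    return gx, gy

x, y = rng.standard_normal(d), rng.standard_normal(d)
for t in range(1, 10_001):
    gamma = 1.0 / np.sqrt(t)   # exploration stepsize: larger, decays more slowly
    eta = 0.1 / t              # update stepsize: smaller, decays faster

    # Exploration (extrapolation) step.
    gx, gy = noisy_grad(x, y)
    x_lead, y_lead = x - gamma * gx, y + gamma * gy

    # Update step, using gradients evaluated at the extrapolated point.
    gx, gy = noisy_grad(x_lead, y_lead)
    x, y = x - eta * gx, y + eta * gy

print("distance to the saddle point (0, 0):", np.linalg.norm(np.concatenate([x, y])))

With a single stepsize (gamma = eta), the noisy iterates on such bilinear problems need not converge, which is the failure mode the abstract highlights; separating the exploration and update timescales is the remedy studied in the paper.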
Document type: Conference papers

Cited literature: 30 references

https://hal.inria.fr/hal-02403555
Contributor: Panayotis Mertikopoulos
Submitted on: Tuesday, December 10, 2019 - 9:15:39 PM
Last modification on: Tuesday, May 11, 2021 - 11:37:50 AM
Long-term archiving on: Wednesday, March 11, 2020 - 6:02:48 PM

File

SingleCall-NIPS.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02403555, version 1
  • arXiv: 1908.08465

Citation

Yu-Guan Hsieh, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos. On the Convergence of Single-Call Stochastic Extra-Gradient Methods. Advances in Neural Information Processing Systems, Dec 2019, Vancouver, Canada. ⟨hal-02403555⟩
