Conference Paper, Year: 2022

Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays

Abstract

The existing analysis of asynchronous stochastic gradient descent (SGD) degrades dramatically when any delay is large, giving the impression that performance depends primarily on the delay. On the contrary, we prove much better guarantees for the same asynchronous SGD algorithm regardless of the delays in the gradients, depending instead just on the number of parallel devices used to implement the algorithm. Our guarantees are strictly better than the existing analyses, and we also argue that asynchronous SGD outperforms synchronous minibatch SGD in the settings we consider. For our analysis, we introduce a novel recursion based on "virtual iterates" and delay-adaptive stepsizes, which allow us to derive state-of-the-art guarantees for both convex and non-convex objectives.
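To make the setting concrete, here is a minimal Python sketch of asynchronous SGD with a delay-adaptive stepsize. This is not the authors' exact algorithm or stepsize rule: the 1/(1 + delay) scaling, the quadratic objective, and the single-process simulation of parallel workers are illustrative assumptions.

```python
import numpy as np

# Minimal single-process simulation of asynchronous SGD with a
# delay-adaptive stepsize (hypothetical rule: shrink the step in
# proportion to the staleness of the gradient being applied).
# Toy objective: f(x) = 0.5 * ||x||^2, so grad f(x) = x.

rng = np.random.default_rng(0)

dim = 10
num_workers = 4      # number of parallel devices
base_lr = 0.1        # baseline stepsize
num_updates = 200
noise_std = 0.01     # stochastic gradient noise

x = rng.normal(size=dim)  # shared parameter vector
# Each worker stores the iterate it last read and the step at which it read it.
snapshots = [(x.copy(), 0) for _ in range(num_workers)]

for t in range(1, num_updates + 1):
    # A random worker finishes next, modeling heterogeneous compute speeds.
    w = rng.integers(num_workers)
    x_old, t_read = snapshots[w]
    delay = t - t_read  # staleness of this worker's gradient

    # Stochastic gradient evaluated at the *stale* iterate.
    grad = x_old + noise_std * rng.normal(size=dim)

    # Delay-adaptive stepsize: larger delays get smaller steps.
    lr = base_lr / (1 + delay)
    x -= lr * grad

    # The worker reads the fresh iterate and starts its next computation.
    snapshots[w] = (x.copy(), t)

print(f"final ||x|| = {np.linalg.norm(x):.4f}")
```

Note how the delay enters only through the stepsize: no gradient is ever discarded, which matches the abstract's point that the same asynchronous SGD updates can be analyzed without assuming bounded delays.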

Dates and versions

hal-03867190 , version 1 (23-11-2022)

Cite

Konstantin Mishchenko, Francis Bach, Mathieu Even, Blake Woodworth. Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays. NeurIPS 2022 - Thirty-sixth Conference on Neural Information Processing Systems, Nov 2022, New Orleans, United States. ⟨hal-03867190⟩