Conference papers

Asaga: Asynchronous Parallel Saga

Rémi Leblond ¹, Fabian Pedregosa ¹, Simon Lacoste-Julien ²
¹ SIERRA - Statistical Machine Learning and Parsimony, DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We describe Asaga, an asynchronous parallel version of the incremental gradient algorithm Saga that enjoys fast linear convergence rates. Through a novel perspective, we revisit and clarify a subtle but important technical issue present in a large fraction of the recent convergence rate proofs for asynchronous parallel optimization algorithms, and propose a simplification of the recently introduced "perturbed iterate" framework that resolves it. We thereby prove that Asaga can obtain a theoretical linear speedup on multi-core systems even without sparsity assumptions. We present results of an implementation on a 40-core architecture illustrating the practical speedup as well as the hardware overhead.
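For readers unfamiliar with the base algorithm, the following is a minimal *sequential* sketch of a Saga-style update on a least-squares objective. It is illustrative only: the function name, the objective, and all parameters are assumptions for this example, and the paper's contribution (Asaga) runs such updates asynchronously across many cores, which this sketch does not attempt.

```python
import numpy as np

def saga(A, b, step, n_iters, rng=None):
    """Sequential SAGA-style sketch for f_i(x) = 0.5 * (a_i . x - b_i)^2.

    Keeps a table with the last gradient seen for each sample; each update
    combines the fresh gradient with the stored one and the table average,
    which reduces the variance of plain SGD and yields linear convergence.
    Illustrative only: Asaga performs these updates asynchronously.
    """
    rng = rng or np.random.default_rng(0)
    n, d = A.shape
    x = np.zeros(d)
    grads = np.zeros((n, d))       # alpha_i: last gradient seen per sample
    grad_avg = np.zeros(d)         # running average of the gradient table
    for _ in range(n_iters):
        i = rng.integers(n)
        g = (A[i] @ x - b[i]) * A[i]           # fresh gradient of f_i at x
        x -= step * (g - grads[i] + grad_avg)  # variance-reduced step
        grad_avg += (g - grads[i]) / n         # keep the average in sync
        grads[i] = g                           # store the fresh gradient
    return x
```

On a consistent linear system the iterates converge to the exact solution; a conventional step size for this family of methods is on the order of 1/(3L), with L the largest per-sample smoothness constant.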

https://hal.inria.fr/hal-01665255
Contributor: Rémi Leblond
Submitted on: Friday, December 15, 2017 - 4:13:47 PM
Last modification on: Tuesday, May 4, 2021 - 2:06:02 PM

File

leblond17a.pdf (files produced by the author(s))

Identifiers

  • HAL Id: hal-01665255, version 1

Citation

Rémi Leblond, Fabian Pedregosa, Simon Lacoste-Julien. Asaga: Asynchronous Parallel Saga. 20th International Conference on Artificial Intelligence and Statistics (AISTATS) 2017, Apr 2017, Fort Lauderdale, Florida, United States. ⟨hal-01665255⟩
