Conference papers

Asaga: Asynchronous Parallel Saga

Rémi Leblond ¹, Fabian Pedregosa ¹, Simon Lacoste-Julien ²
¹ SIERRA - Statistical Machine Learning and Parsimony, DI-ENS - Département d'informatique - ENS Paris, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We describe Asaga, an asynchronous parallel version of the incremental gradient algorithm Saga that enjoys fast linear convergence rates. Through a novel perspective, we revisit and clarify a subtle but important technical issue present in a large fraction of the recent convergence rate proofs for asynchronous parallel optimization algorithms, and propose a simplification of the recently introduced "perturbed iterate" framework that resolves it. We thereby prove that Asaga can obtain a theoretical linear speedup on multi-core systems even without sparsity assumptions. We present results of an implementation on a 40-core architecture illustrating the practical speedup as well as the hardware overhead.
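To make the abstract concrete, the sequential Saga update that Asaga parallelizes can be sketched as follows. This is a minimal illustration on a hypothetical ridge-regularized least-squares problem (the problem setup, names, step size, and iteration count are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative toy problem (assumption, not from the paper):
# f_i(x) = 0.5*(a_i . x - b_i)^2 + 0.5*lam*||x||^2
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true
lam = 0.1

def grad_i(x, i):
    """Gradient of the i-th component function f_i at x."""
    return (A[i] @ x - b[i]) * A[i] + lam * x

# Saga keeps a memory table `alpha` of the last gradient seen for each
# sample, plus its running average; each step then uses an unbiased,
# variance-reduced gradient estimate.
x = np.zeros(d)
alpha = np.array([grad_i(x, i) for i in range(n)])
avg = alpha.mean(axis=0)
step = 0.01  # illustrative; the paper analyzes admissible step sizes

for _ in range(5000):
    j = rng.integers(n)
    g = grad_i(x, j)
    # Saga step: fresh gradient, minus the stale stored one, plus the average.
    x -= step * (g - alpha[j] + avg)
    # Update the memory table and its average in O(d).
    avg += (g - alpha[j]) / n
    alpha[j] = g

# In Asaga, several cores would run this inner loop concurrently on the
# shared x, alpha, and avg without locks; reads of x may then be
# inconsistent, which is what the perturbed-iterate analysis addresses.
print(np.linalg.norm(x - x_true))
```

The key implementation point the abstract alludes to is that no synchronization is used in the parallel version: cores read and write the shared iterate lock-free, and the convergence proof accounts for the resulting inconsistent reads.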

Cited literature: 21 references
Contributor: Rémi Leblond
Submitted on: Friday, December 15, 2017 - 4:13:47 PM
Last modification on: Wednesday, June 8, 2022 - 12:50:05 PM


Files produced by the author(s)


  • HAL Id: hal-01665255, version 1



Rémi Leblond, Fabian Pedregosa, Simon Lacoste-Julien. Asaga: Asynchronous Parallel Saga. 20th International Conference on Artificial Intelligence and Statistics (AISTATS) 2017, Apr 2017, Fort Lauderdale, Florida, United States. ⟨hal-01665255⟩


