Asaga: Asynchronous Parallel Saga

Rémi Leblond 1 Fabian Pedregosa 1 Simon Lacoste-Julien 2
1 SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We describe Asaga, an asynchronous parallel version of the incremental gradient algorithm Saga that enjoys fast linear convergence rates. Through a novel perspective, we revisit and clarify a subtle but important technical issue present in a large fraction of the recent convergence rate proofs for asynchronous parallel optimization algorithms, and propose a simplification of the recently introduced "perturbed iterate" framework that resolves it. We thereby prove that Asaga can obtain a theoretical linear speedup on multi-core systems even without sparsity assumptions. We present results of an implementation on a 40-core architecture illustrating the practical speedup as well as the hardware overhead.
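To make the abstract concrete, the following is a minimal sequential sketch of the SAGA update that Asaga parallelizes, applied to a small ridge-regression problem. This is an illustration under our own assumptions (problem setup, names such as `grad_i`, and the step size are ours, not from the paper), not the authors' implementation; in Asaga this same update is run asynchronously by multiple cores without locks.

```python
import numpy as np

# Illustrative sequential SAGA on ridge regression (our own toy setup;
# Asaga runs this update asynchronously across cores, per the abstract).
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)
lam = 0.1  # L2 regularization makes the objective strongly convex,
           # which is where SAGA's linear convergence rate applies.

def grad_i(x, i):
    # Gradient of f_i(x) = 0.5*(a_i^T x - b_i)^2 + 0.5*lam*||x||^2
    return A[i] * (A[i] @ x - b[i]) + lam * x

x = np.zeros(d)
memory = np.array([grad_i(x, i) for i in range(n)])  # stored gradients alpha_i
avg = memory.mean(axis=0)                            # running mean of the table
L = np.max(np.sum(A**2, axis=1)) + lam               # smoothness constant
step = 1.0 / (3.0 * L)                               # conservative step size

for _ in range(50 * n):
    i = rng.integers(n)
    g = grad_i(x, i)
    x -= step * (g - memory[i] + avg)  # SAGA direction: unbiased, low variance
    avg += (g - memory[i]) / n         # keep the running mean consistent
    memory[i] = g                      # overwrite the stored gradient for i

# Closed-form minimizer of (1/n) sum_i f_i for comparison
x_star = np.linalg.solve(A.T @ A + n * lam * np.eye(d), A.T @ b)
```

After enough passes, `x` approaches `x_star`; the point of the asynchronous analysis in the paper is that this convergence survives when several workers read and write `x`, `memory`, and `avg` concurrently with inconsistent reads.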
Document type: Conference paper
20th International Conference on Artificial Intelligence and Statistics (AISTATS) 2017, Apr 2017, Fort Lauderdale, Florida, United States

Cited literature: 21 references

https://hal.inria.fr/hal-01665255
Contributor: Rémi Leblond
Submitted on: Friday, 15 December 2017 - 16:13:47
Last modified on: Thursday, 26 April 2018 - 10:29:04

File

leblond17a.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01665255, version 1

Citation

Rémi Leblond, Fabian Pedregosa, Simon Lacoste-Julien. Asaga: Asynchronous Parallel Saga. 20th International Conference on Artificial Intelligence and Statistics (AISTATS) 2017, Apr 2017, Fort Lauderdale, Florida, United States. 〈hal-01665255〉

Metrics

Record views: 287
File downloads: 24