Conference paper · Year: 2019

Gossip Learning as a Decentralized Alternative to Federated Learning

Abstract

Federated learning is a distributed machine learning approach for computing models over data collected by edge devices. Most importantly, the data itself is not collected centrally; instead, a master-worker architecture is applied in which a master node performs aggregation and the edge devices act as workers, not unlike the parameter server approach. Gossip learning also assumes that the data remains at the edge devices, but it requires no aggregation server or any other central component. In this empirical study, we present a thorough comparison of the two approaches. We examine the aggregated cost of machine learning in both cases, also considering a compression technique applicable in both approaches. We also apply a real churn trace collected over mobile phones, and we experiment with different distributions of the training data over the devices. Surprisingly, gossip learning actually outperforms federated learning in all the scenarios where the training data are distributed uniformly over the nodes, and it performs comparably to federated learning overall.
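The following is a minimal, self-contained Python sketch of the gossip learning idea summarized above: each node keeps its data and a local model, periodically sends its model to a randomly chosen peer, and the receiving peer merges (averages) the incoming model with its own before taking a gradient step on its local data, with no aggregation server involved. This is an illustration only, not the authors' protocol from the paper; the linear-regression model, the averaging merge rule, the cycle schedule, and the synthetic data are simplifying assumptions made for the example.

```python
# Illustrative gossip learning sketch (assumed details, not the paper's exact protocol).
import numpy as np

rng = np.random.default_rng(0)
NUM_NODES, DIM, STEP = 20, 5, 0.05
true_w = rng.normal(size=DIM)

# Each node holds a private local dataset that is never shared with other nodes.
local_data = []
for _ in range(NUM_NODES):
    X = rng.normal(size=(50, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    local_data.append((X, y))

# Every node maintains its own model; there is no central parameter server.
models = [np.zeros(DIM) for _ in range(NUM_NODES)]

def local_update(w, X, y):
    # One gradient step on the node's own data (squared loss).
    grad = X.T @ (X @ w - y) / len(y)
    return w - STEP * grad

for cycle in range(200):
    for sender in range(NUM_NODES):
        receiver = rng.integers(NUM_NODES)                 # send model to a random peer
        merged = (models[receiver] + models[sender]) / 2   # receiver merges by averaging
        models[receiver] = local_update(merged, *local_data[receiver])

avg_err = np.mean([np.linalg.norm(w - true_w) for w in models])
print(f"mean distance to true weights after gossip: {avg_err:.3f}")
```

In the federated learning setting described above, the difference would be that a central master collects the workers' models each round, aggregates them, and sends the aggregated model back to the edge devices, rather than models being exchanged directly between peers.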
Main file

485766_1_En_5_Chapter.pdf (1.92 MB). Origin: files produced by the author(s).

Dates and versions

hal-02319574, version 1 (18-10-2019)

License

Attribution

Identifiers

hal-02319574
DOI: 10.1007/978-3-030-22496-7_5

Cite

István Hegedűs, Gábor Danner, Márk Jelasity. Gossip Learning as a Decentralized Alternative to Federated Learning. 19th IFIP International Conference on Distributed Applications and Interoperable Systems (DAIS), Jun 2019, Kongens Lyngby, Denmark. pp.74-90, ⟨10.1007/978-3-030-22496-7_5⟩. ⟨hal-02319574⟩