
Accelerated decentralized optimization with local updates for smooth and strongly convex objectives

Hadrien Hendrikx 1, 2, 3, 4 Francis Bach 3, 4 Laurent Massoulié 2, 1, 4
2 DYOGENE - Dynamics of Geometric Networks (Inria de Paris; CNRS UMR 8548; DI-ENS - Département d'informatique - ENS Paris)
3 SIERRA - Statistical Machine Learning and Parsimony (DI-ENS - Département d'informatique - ENS Paris; CNRS; Inria de Paris)
Abstract: In this paper, we study the problem of minimizing a sum of smooth and strongly convex functions split over the nodes of a network in a decentralized fashion. We propose ESDACD, a decentralized accelerated algorithm that only requires local synchrony. Its rate depends on the condition number κ of the local functions as well as on the network topology and delays. Under mild assumptions on the topology of the graph, ESDACD takes a time O((τ_max + Δ_max) √(κ/γ) ln(ε⁻¹)) to reach a precision ε, where γ is the spectral gap of the graph, τ_max the maximum communication delay and Δ_max the maximum computation time. It therefore matches the rate of SSDA (Scaman et al., 2017), which is optimal when τ_max = Ω(Δ_max). Applying ESDACD to quadratic local functions leads to an accelerated randomized gossip algorithm of rate O(√(θ_gossip/n)), where θ_gossip is the rate of standard randomized gossip (Boyd et al., 2006). To the best of our knowledge, it is the first asynchronous gossip algorithm with a provably improved rate of convergence for the second moment of the error. We illustrate these results with experiments in idealized settings.
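For concreteness, the baseline that this result accelerates — standard randomized pairwise gossip (Boyd et al., 2006) — can be sketched as follows. This is a minimal illustration, not the authors' ESDACD algorithm; the ring topology, seed, and iteration count are arbitrary choices for the example.

```python
import numpy as np

def randomized_gossip(x, edges, n_iters, rng=None):
    """Standard randomized pairwise gossip: at each step one edge
    (i, j) is activated uniformly at random and both endpoints
    replace their values with the pairwise average."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x, dtype=float).copy()
    for _ in range(n_iters):
        i, j = edges[rng.integers(len(edges))]
        avg = (x[i] + x[j]) / 2.0  # averaging preserves the global sum
        x[i] = x[j] = avg
    return x

# Example: a ring of 8 nodes; every local value converges to the mean.
n = 8
edges = [(k, (k + 1) % n) for k in range(n)]
x0 = np.arange(n, dtype=float)
x = randomized_gossip(x0, edges, n_iters=20000)
```

Each activation is a local update involving only two neighbors, which is the asynchronous setting the paper works in; the paper's contribution is showing that an accelerated variant provably improves this baseline's rate.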
Contributor: Hadrien Hendrikx
Submitted on: Thursday, January 16, 2020
Last modification on: Tuesday, January 11, 2022
Long-term archiving on: Friday, April 17, 2020




  • HAL Id : hal-01893568, version 2



Hadrien Hendrikx, Francis Bach, Laurent Massoulié. Accelerated decentralized optimization with local updates for smooth and strongly convex objectives. AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics, Apr 2019, Naha, Okinawa, Japan. ⟨hal-01893568v2⟩


