
Fully Distributed Privacy Preserving Mini-batch Gradient Descent Learning

Abstract: In fully distributed machine learning, privacy and security are important issues. These issues are often addressed using secure multiparty computation (MPC). However, in our application domain, known MPC algorithms are not scalable or robust enough. We propose a lightweight protocol to quickly and securely compute the sum of the inputs of a subset of participants, assuming a semi-honest adversary. During the computation the participants learn no individual values. We apply this protocol to efficiently calculate the sum of gradients as part of a fully distributed mini-batch stochastic gradient descent algorithm. The protocol achieves scalability and robustness by exploiting the fact that in this application domain a “quick and dirty” sum computation is acceptable; in other words, speed and robustness take precedence over precision. We analyze the protocol both theoretically and experimentally, based on churn statistics from a real smartphone trace. We derive a sufficient condition for preventing the leakage of an individual value, and we demonstrate that the overhead of the protocol is feasible.
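The core primitive here is a secure sum: each participant contributes a private input (in this application, a local gradient), and only the aggregate is revealed. As a minimal illustration of how a sum can be computed without exposing individual values to a semi-honest adversary, the sketch below uses pairwise additive masking, a standard textbook construction. It is not the authors' protocol (which additionally handles churn and trades precision for speed), and the field modulus, function names, and fixed participant set are illustrative assumptions.

```python
# Minimal sketch of a secure sum via pairwise additive masking, assuming a
# semi-honest adversary, a fixed participant set, and reliable pairwise
# channels. Illustrative only; not the protocol proposed in the paper.
import random

FIELD = 2**61 - 1  # hypothetical choice: arithmetic modulo a large prime

def make_masks(n_participants, rng=random):
    """Generate pairwise masks: masks[i][j] cancels masks[j][i] in the sum."""
    masks = [[0] * n_participants for _ in range(n_participants)]
    for i in range(n_participants):
        for j in range(i + 1, n_participants):
            r = rng.randrange(FIELD)
            masks[i][j] = r             # participant i adds r
            masks[j][i] = (-r) % FIELD  # participant j subtracts r
    return masks

def masked_input(i, value, masks):
    """What participant i publishes: its value blinded by all of its masks."""
    return (value + sum(masks[i])) % FIELD

def secure_sum(values):
    n = len(values)
    masks = make_masks(n)
    published = [masked_input(i, v, masks) for i, v in enumerate(values)]
    # The masks cancel pairwise, so the published values sum to the true total,
    # while each individual published value looks uniformly random.
    return sum(published) % FIELD

if __name__ == "__main__":
    # In the SGD application, these would be fixed-point encodings of one
    # coordinate of each participant's local gradient.
    gradients = [3, 7, 2, 9]
    assert secure_sum(gradients) == sum(gradients) % FIELD
```

In the mini-batch gradient descent setting, the returned sum over the participating subset would feed the model update; the privacy of each input rests on at least one of its pairwise masks remaining unknown to the adversary.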
Contributor: Hal Ifip
Submitted on: Tuesday, April 24, 2018 - 11:53:22 AM
Last modification on: Tuesday, April 24, 2018 - 12:42:52 PM
Long-term archiving on: Wednesday, September 19, 2018 - 8:13:07 AM


Files produced by the author(s)


Distributed under a Creative Commons Attribution 4.0 International License



Gábor Danner, Márk Jelasity. Fully Distributed Privacy Preserving Mini-batch Gradient Descent Learning. 15th IFIP International Conference on Distributed Applications and Interoperable Systems (DAIS), Jun 2015, Grenoble, France. pp.30-44, ⟨10.1007/978-3-319-19129-4_3⟩. ⟨hal-01775029⟩


