Differential Privacy Guarantees for Stochastic Gradient Langevin Dynamics - Inria - Institut national de recherche en sciences et technologies du numérique
Preprint, working paper. Year: 2022

Differential Privacy Guarantees for Stochastic Gradient Langevin Dynamics

Abstract

We analyse the privacy leakage of noisy stochastic gradient descent by modelling Rényi divergence dynamics with Langevin diffusions. Inspired by recent work on non-stochastic algorithms, we derive similar desirable properties in the stochastic setting. In particular, we prove that the privacy loss converges exponentially fast for smooth and strongly convex objectives under a constant step size, a significant improvement over previous DP-SGD analyses. We also extend our analysis to arbitrary sequences of varying step sizes and derive new utility bounds. Finally, we propose an implementation, and our experiments show the practical utility of our approach compared to classical DP-SGD libraries.
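To illustrate the kind of algorithm the abstract refers to, here is a minimal sketch of noisy stochastic gradient descent (the DP-SGD template: per-step gradient clipping plus calibrated Gaussian noise) on a toy smooth, strongly convex objective. All function names, parameters, and the objective are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def noisy_sgd(grad_fn, theta0, n_steps, step_size, noise_scale, clip_norm, rng):
    """DP-SGD-style iteration: clip each gradient, then add Gaussian noise.

    Clipping bounds the per-step sensitivity; the noise standard deviation is
    scaled to the clipping bound (assumption: standard L2 clipping).
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_steps):
        g = grad_fn(theta)
        # Bound the gradient's L2 norm so one sample's influence is limited.
        norm = np.linalg.norm(g)
        if norm > clip_norm:
            g = g * (clip_norm / norm)
        # Gaussian noise calibrated to the clipping bound.
        g = g + rng.normal(0.0, noise_scale * clip_norm, size=g.shape)
        theta = theta - step_size * g
    return theta

# Toy strongly convex objective: f(theta) = 0.5 * ||theta - target||^2,
# whose gradient is simply theta - target.
target = np.array([1.0, -2.0])
grad = lambda th: th - target
rng = np.random.default_rng(0)
theta = noisy_sgd(grad, np.zeros(2), n_steps=500, step_size=0.1,
                  noise_scale=0.05, clip_norm=1.0, rng=rng)
```

With a constant step size on this strongly convex objective, the iterates concentrate around the minimiser, with residual spread set by the injected noise; the paper's result concerns the corresponding Rényi privacy loss converging exponentially fast in this regime.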
Main file: _ICML__Langevin_SGD___ICML_Template.pdf (361.29 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03547726 , version 1 (28-01-2022)
hal-03547726 , version 2 (05-02-2022)

Identifiers

  • HAL Id: hal-03547726, version 2

Cite

Théo Ryffel, Francis Bach, David Pointcheval. Differential Privacy Guarantees for Stochastic Gradient Langevin Dynamics. 2022. ⟨hal-03547726v2⟩
104 views
176 downloads
