Journal article in IEEE Transactions on Knowledge and Data Engineering, Year: 2019

Differentially Private Mixture of Generative Neural Networks

Abstract

Generative models are used in a wide range of applications that build on large amounts of contextually rich information. However, due to possible privacy violations against the individuals whose data is used to train these models, publishing or sharing generative models is not always viable. In this paper, we present a novel technique for privately releasing generative models and entire high-dimensional datasets produced by them. We model the generator distribution of the training data with a mixture of k generative neural networks, which are trained together and collectively learn the generator distribution of the dataset. The data is divided into k clusters using a novel differentially private kernel k-means; each cluster is then assigned to a separate generative neural network, such as a Restricted Boltzmann Machine or a Variational Autoencoder, which is trained only on its own cluster using differentially private gradient descent. We evaluate our approach on the MNIST dataset, as well as on call detail record and transit datasets, showing that it produces realistic synthetic samples that can also be used to accurately answer an arbitrary number of counting queries.
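
The code below is a minimal, self-contained sketch of this two-stage pipeline in NumPy, not the authors' implementation: it substitutes ordinary k-means with Gaussian-noised cluster sums and counts for the paper's differentially private kernel k-means, and a toy per-cluster Gaussian "generator" fitted by clipped, noised gradients (in the style of DP-SGD) for the RBM/VAE components. All names and parameters (dp_kmeans, dp_sgd_gaussian_fit, clip, noise_mult) are illustrative assumptions, and calibrating the noise to a concrete (epsilon, delta) budget is omitted.

    # Sketch of the two-stage pipeline: DP clustering, then per-cluster DP training.
    import numpy as np

    rng = np.random.default_rng(0)

    def dp_kmeans(X, k, iters=10, noise_scale=1.0):
        """k-means where each centroid update perturbs the per-cluster
        sum and count with Gaussian noise (Gaussian mechanism)."""
        centroids = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                members = X[labels == j]
                noisy_sum = members.sum(axis=0) + rng.normal(0, noise_scale, X.shape[1])
                noisy_count = max(len(members) + rng.normal(0, noise_scale), 1.0)
                centroids[j] = noisy_sum / noisy_count
        return labels, centroids

    def dp_sgd_gaussian_fit(X, clip=1.0, noise_mult=1.0, lr=0.1, epochs=20, batch=32):
        """Fit a per-cluster Gaussian 'generator' (mean only, fixed covariance)
        by DP-SGD: per-example gradients are clipped to norm `clip` and
        summed with Gaussian noise of std `noise_mult * clip`."""
        mu = np.zeros(X.shape[1])
        for _ in range(epochs):
            idx = rng.choice(len(X), size=min(batch, len(X)), replace=False)
            grads = mu - X[idx]  # per-example gradient of 0.5 * ||x - mu||^2
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
            noisy = grads.sum(0) + rng.normal(0, noise_mult * clip, mu.shape)
            mu -= lr * noisy / len(idx)
        return mu

    # Toy data: three well-separated 2-D blobs standing in for a real dataset.
    X = np.concatenate([rng.normal(c, 0.3, (200, 2)) for c in ([0, 0], [4, 0], [2, 4])])
    labels, _ = dp_kmeans(X, k=3)
    generators = [dp_sgd_gaussian_fit(X[labels == j]) for j in range(3)]

    # Synthetic release: sample from the mixture of per-cluster generators.
    synthetic = np.concatenate([rng.normal(mu, 0.3, (200, 2)) for mu in generators])
    print("noisy cluster means:", np.round(generators, 2))

Because each stage only touches the data through noised aggregates or noised clipped gradients, the synthetic samples drawn at the end carry a differential privacy guarantee whose strength depends on the noise scales and number of iterations; the paper's construction applies the same principle with kernel k-means and RBM/VAE generators.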

Dates and versions

hal-01921923 , version 1 (14-11-2018)

Identifiers

Cite

Gergely Acs, Luca Melis, Claude Castelluccia, Emiliano de Cristofaro. Differentially Private Mixture of Generative Neural Networks. IEEE Transactions on Knowledge and Data Engineering, 2019, 31 (6), pp.1109-1121. ⟨10.1109/TKDE.2018.2855136⟩. ⟨hal-01921923⟩