
Efficient and privacy-preserving compressive learning

Abstract: The topic of this Ph.D. thesis lies on the borderline between signal processing, statistics and computer science. It mainly focuses on compressive learning, a paradigm for large-scale machine learning in which the whole dataset is compressed down to a single vector of randomized generalized moments, called the sketch. An approximate solution of the learning task at hand is then estimated from this sketch, without using the initial data. This framework is by nature suited for learning from distributed collections or data streams, and has already been instantiated with success on several unsupervised learning tasks such as k-means clustering, density fitting using Gaussian mixture models, or principal component analysis. We improve this framework in multiple directions. First, it is shown that perturbing the sketch with additive noise is sufficient to derive (differential) privacy guarantees. Sharp bounds on the noise level required to obtain a given privacy level are provided, and the proposed method is shown empirically to compare favourably with state-of-the-art techniques. Then, the compression scheme is modified to leverage structured random matrices, which reduce the computational cost of the framework and make it possible to learn on high-dimensional data. Lastly, we introduce a new algorithm based on message passing techniques to learn from the sketch for the k-means clustering problem. These contributions open the way for a broader application of the framework.
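The sketching idea described in the abstract can be illustrated with a minimal NumPy sketch. This is not the thesis's implementation: the dataset, the sketch size `m`, and the noise scale `sigma` are all illustrative assumptions. A common instantiation of "randomized generalized moments" is an average of random Fourier features, and the privacy mechanism is approximated here by simple additive Gaussian noise on the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: n points in dimension d (sizes are illustrative only).
n, d, m = 1000, 2, 50          # m = sketch size (number of generalized moments)
X = rng.normal(size=(n, d))

# Random frequency vectors defining the randomized generalized moments.
Omega = rng.normal(size=(d, m))

# Sketch: empirical average of complex exponentials (random Fourier features),
# z = (1/n) * sum_i exp(i * x_i^T Omega).
# A single m-dimensional vector summarizes the whole dataset, whatever n is,
# and sketches of distributed chunks or stream batches can simply be averaged.
z = np.exp(1j * (X @ Omega)).mean(axis=0)

# Privacy-style perturbation: noise is added to the sketch only, and the raw
# data X is never released. The noise level sigma is an arbitrary placeholder,
# not the calibrated bound derived in the thesis.
sigma = 0.1
z_private = z + sigma * (rng.normal(size=m) + 1j * rng.normal(size=m))
```

Downstream, a learning task such as k-means would be solved by fitting cluster centers whose sketch matches `z_private`, without ever revisiting `X`.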
Metadata

https://tel.archives-ouvertes.fr/tel-03023287
Contributor: ABES STAR
Submitted on: Friday, January 8, 2021 - 11:01:07 AM
Last modification on: Thursday, January 14, 2021 - 4:32:14 PM

File

CHATALIC_Antoine.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-03023287, version 2

Citation

Antoine Chatalic. Efficient and privacy-preserving compressive learning. Machine Learning [cs.LG]. Université Rennes 1, 2020. English. ⟨NNT : 2020REN1S030⟩. ⟨tel-03023287v2⟩

Metrics

  • Record views: 53
  • File downloads: 19