Conference paper, Year: 2016

Sketching for Large-Scale Learning of Mixture Models

Abstract

Learning parameters from voluminous data can be prohibitive in terms of memory and computational requirements. We propose a "compressive learning" framework where we first sketch the data by computing random generalized moments of the underlying probability distribution, then estimate mixture model parameters from the sketch using an iterative algorithm analogous to greedy sparse signal recovery. We exemplify our framework with the sketched estimation of Gaussian Mixture Models (GMMs). We experimentally show that our approach yields results comparable to the classical Expectation-Maximization (EM) technique while requiring significantly less memory and fewer computations when the number of database elements is large. We report large-scale experiments in speaker verification, where our approach makes it possible to fully exploit a corpus of 1000 hours of speech signal to learn a universal background model at scales computationally inaccessible to EM.
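To illustrate the sketching step described above, here is a minimal sketch computation, under the assumption (common in the compressive-GMM literature) that the random generalized moments are samples of the empirical characteristic function at randomly drawn frequencies. The function name `compute_sketch` and the Gaussian frequency draw are illustrative choices, not the authors' exact implementation.

```python
import numpy as np

def compute_sketch(X, Omega):
    """Average complex exponentials exp(i <omega_j, x>) over the dataset.

    X:     (n, d) array of data points.
    Omega: (d, m) array of random frequencies, one column per moment.
    Returns the length-m complex sketch: samples of the empirical
    characteristic function at the m frequencies.
    """
    return np.exp(1j * (X @ Omega)).mean(axis=0)

rng = np.random.default_rng(0)
d, m, n = 2, 64, 10_000
X = rng.normal(size=(n, d))                 # toy data (stand-in for real features)
Omega = rng.normal(size=(d, m))             # random Gaussian frequency draws
z = compute_sketch(X, Omega)                # fixed size m, independent of n
```

The key property motivating the approach is visible here: the sketch has a fixed size `m` regardless of the number of data points `n`, can be updated in a single streaming pass, and sketches of disjoint data chunks can be merged by a weighted average, which is what makes very large corpora tractable.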
Main file: paper.pdf (611.69 KB). Origin: files produced by the author(s).

Dates and versions

hal-01208027 , version 1 (01-10-2015)
hal-01208027 , version 2 (23-10-2015)
hal-01208027 , version 3 (01-03-2016)

Identifiers

  • HAL Id : hal-01208027 , version 2

Cite

Nicolas Keriven, Anthony Bourrier, Rémi Gribonval, Patrick Pérez. Sketching for Large-Scale Learning of Mixture Models. 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2016), Mar 2016, Shanghai, China. ⟨hal-01208027v2⟩
952 views
895 downloads
