A bumpy journey: exploring deep Gaussian mixture models - Inria - Institut national de recherche en sciences et technologies du numérique
Conference paper, 2020

A bumpy journey: exploring deep Gaussian mixture models

Abstract

The deep Gaussian mixture model (DGMM) is a framework directly inspired by the finite mixture of factor analysers (MFA) model and by deep learning architectures composed of multiple layers. The MFA is a generative model in which a data point arises from a latent variable (termed the score) that is sampled from a standard multivariate Gaussian distribution and then transformed linearly. The linear transformation matrix (termed the loading matrix) is specific to a component of the finite mixture. The DGMM stacks MFA layers: the latent scores are no longer assumed to be drawn from a standard Gaussian, but instead from a mixture of factor analysers. Each layer's latent scores thus serve both as the input of an MFA and as variables that have latent scores of their own. Only the latent scores of the DGMM's last layer are assumed to be drawn from a standard multivariate Gaussian distribution. In recent years, the DGMM has gained prominence in the literature: intuitively, this model should be able to capture complex distributions more precisely than a simple Gaussian mixture model. We show in this work that, while the DGMM is an original and novel idea, in certain cases inferring its parameters is challenging. In addition, we give some insights into the probable reasons for this difficulty. Experimental results are provided on GitHub: https://github.com/ansubmissions/ICBINB, alongside an R package that implements the algorithm and a number of ready-to-run R scripts.
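The layered generative process described in the abstract can be sketched as follows. This is a minimal illustration in Python (the paper's own implementation is an R package); the dimensions, component counts, mixture weights, and the isotropic noise term are illustrative assumptions, not the authors' settings. Deepest-layer scores are drawn from a standard Gaussian, then each MFA layer picks a mixture component and applies that component's affine map plus noise:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mfa_layer(scores, weights, means, loadings, noise_sd):
    """One MFA layer: for each latent score, pick a mixture component,
    then apply its component-specific loading matrix and mean, plus
    isotropic Gaussian noise (a simplifying assumption)."""
    out = []
    for z in scores:
        k = rng.choice(len(weights), p=weights)
        mu, L = means[k], loadings[k]
        out.append(mu + L @ z + rng.normal(0.0, noise_sd, size=mu.shape))
    return np.array(out)

n = 500
# Deepest layer: scores from a standard multivariate Gaussian (here 1-D).
z2 = rng.normal(size=(n, 1))

# Intermediate layer (2 components): maps 1-D scores to 2-D scores,
# so these 2-D scores are themselves MFA-distributed, not Gaussian.
z1 = sample_mfa_layer(
    z2,
    weights=[0.5, 0.5],
    means=[np.zeros(2), np.ones(2)],
    loadings=[rng.normal(size=(2, 1)), rng.normal(size=(2, 1))],
    noise_sd=0.1,
)

# Observation layer (3 components): maps 2-D scores to 3-D data points.
x = sample_mfa_layer(
    z1,
    weights=[0.3, 0.3, 0.4],
    means=[rng.normal(size=3) for _ in range(3)],
    loadings=[rng.normal(size=(3, 2)) for _ in range(3)],
    noise_sd=0.1,
)
print(x.shape)  # (500, 3)
```

With a single layer this reduces to the ordinary MFA; stacking layers composes component-specific linear maps, which is what should let the DGMM capture more complex distributions than a plain Gaussian mixture.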
Main file: ICBINB_hal.pdf (312.67 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02985701, version 1 (02-11-2020)

Identifiers

  • HAL Id: hal-02985701, version 1

Cite

Margot Selosse, Isobel Claire Gormley, Julien Jacques, Christophe Biernacki. A bumpy journey: exploring deep Gaussian mixture models. I Can't Believe It's Not Better @ NeurIPS 2020, Dec 2020, Vancouver, Canada. ⟨hal-02985701⟩
