Self-organizing mixture models

Abstract: We present an expectation-maximization (EM) algorithm that yields topology preserving maps of data based on probabilistic mixture models. Our approach is applicable to any mixture model for which we have a normal EM algorithm. Compared to other mixture model approaches to self-organizing maps (SOMs), the function our algorithm maximizes has a clear interpretation: it sums data log-likelihood and a penalty term that enforces self-organization. Our approach allows principled handling of missing data and learning of mixtures of SOMs. We present example applications illustrating our approach for continuous, discrete, and mixed discrete and continuous data.
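To make the abstract's idea concrete, here is a minimal Python sketch in its spirit, not the authors' exact algorithm: a Gaussian mixture on a one-dimensional grid of units, trained with EM where the E-step constrains each point's responsibilities to a normalized neighborhood kernel centred on a winning unit. Maximizing the resulting free energy corresponds to the penalized log-likelihood the abstract describes. The function name som_em, the isotropic components, the 1-D grid, and the fixed neighborhood width sigma_h are all illustrative assumptions.

    # Illustrative sketch under assumed details; not the paper's exact algorithm.
    import numpy as np

    def som_em(X, n_units=10, n_iter=50, sigma_h=2.0, seed=0):
        """Self-organizing Gaussian mixture on a 1-D grid of units.

        E-step: each point's responsibility vector is constrained to a
        normalized neighborhood kernel centred on a 'winner' unit; this
        constraint plays the role of the self-organization penalty.
        M-step: standard responsibility-weighted mixture updates.
        """
        rng = np.random.default_rng(seed)
        n, d = X.shape
        mu = X[rng.choice(n, n_units, replace=False)]   # initial means from data
        var = np.full(n_units, X.var())                 # isotropic variances
        grid = np.arange(n_units)
        # H[r, s]: neighborhood kernel on the grid, normalized for each winner r
        H = np.exp(-0.5 * (grid[:, None] - grid[None, :]) ** 2 / sigma_h ** 2)
        H /= H.sum(axis=1, keepdims=True)
        ent = -(H * np.log(H)).sum(axis=1)              # entropy of each kernel row

        for _ in range(n_iter):
            # log p(x_n, s), dropping the constant uniform mixing weight
            sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)        # (n, k)
            log_joint = -0.5 * (sq / var + d * np.log(2 * np.pi * var))
            # E-step: winner r maximizes E_{H[r]}[log p(x, s)] + entropy(H[r])
            winners = np.argmax(log_joint @ H.T + ent, axis=1)
            Q = H[winners]                              # smoothed responsibilities
            # M-step: responsibility-weighted mean and variance updates
            w = Q.sum(axis=0)
            mu = (Q.T @ X) / w[:, None]
            sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
            var = (Q * sq).sum(axis=0) / (d * w) + 1e-9
        return mu, var, winners

Because responsibilities are shared across grid neighbours, units adjacent on the grid are pulled toward nearby regions of the data space; this is the topology preservation the abstract refers to. For example, mu, var, winners = som_em(np.random.default_rng(1).normal(size=(500, 2))) fits a chain of ten units whose consecutive means trace a path through the point cloud.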
Document type:
Journal article
Neurocomputing, Elsevier, 2005, 63, pp. 99–123. DOI: 10.1016/j.neucom.2004.04.008


https://hal.inria.fr/inria-00321479
Contributor: Jakob Verbeek
Submitted on: Wednesday, February 16, 2011 - 16:22:52
Last modified on: Monday, September 25, 2017 - 10:08:04
Long-term archiving on: Tuesday, May 17, 2011 - 02:25:46

Files

verbeek03neuro_final.pdf
Files produced by the author(s)

Identifiers

HAL Id: inria-00321479
DOI: 10.1016/j.neucom.2004.04.008

Citation

Jakob Verbeek, Nikos Vlassis, Ben Krose. Self-organizing mixture models. Neurocomputing, Elsevier, 2005, 63, pp. 99–123. DOI: 10.1016/j.neucom.2004.04.008. HAL: inria-00321479.

Metrics

Record views: 326
File downloads: 883