Journal article: Neurocomputing, 2005

Self-organizing mixture models

Jakob Verbeek
Nikos Vlassis
Ben Krose

Abstract

We present an expectation-maximization (EM) algorithm that yields topology preserving maps of data based on probabilistic mixture models. Our approach is applicable to any mixture model for which we have a normal EM algorithm. Compared to other mixture model approaches to self-organizing maps (SOMs), the function our algorithm maximizes has a clear interpretation: it sums data log-likelihood and a penalty term that enforces self-organization. Our approach allows principled handling of missing data and learning of mixtures of SOMs. We present example applications illustrating our approach for continuous, discrete, and mixed discrete and continuous data.
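To make the objective described above more concrete (data log-likelihood plus a penalty that enforces self-organization), the sketch below runs EM for an isotropic Gaussian mixture whose components sit on a 1-D grid and replaces the usual E-step posteriors by fixed neighborhood functions centred on a winning unit. This is a minimal sketch under assumptions made here (the 1-D grid, shared isotropic variances, fixed neighborhood width, and all function names), not the paper's exact algorithm.

```python
# Illustrative sketch (not the paper's exact algorithm): EM for an isotropic
# Gaussian mixture whose components live on a 1-D grid. The E-step uses
# normalized neighborhood functions centred on a "winner" unit in place of the
# exact posteriors, which is what couples the mixture to a SOM-like topology.
# The neighborhood width, shared variances, and all names are assumptions
# made for this sketch.
import numpy as np

def neighborhood(n_units, width):
    """Row k holds a normalized Gaussian neighborhood centred on unit k."""
    grid = np.arange(n_units)
    H = np.exp(-0.5 * ((grid[None, :] - grid[:, None]) / width) ** 2)
    return H / H.sum(axis=1, keepdims=True)

def som_mixture_em(X, n_units=10, width=1.5, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, n_units, replace=False)]   # component means
    var = np.full(n_units, X.var())                 # isotropic variances
    log_pi = np.full(n_units, -np.log(n_units))     # log mixing weights
    H = neighborhood(n_units, width)                # (n_units, n_units)

    for _ in range(n_iter):
        # Component log-joints log(pi_j) + log N(x_n; mu_j, var_j I): (n, n_units)
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        log_p = log_pi - 0.5 * (d * np.log(2 * np.pi * var) + sq / var)

        # E-step: score each candidate winner by the neighborhood-smoothed
        # log-joint, pick the best unit per point, and use its (fixed)
        # neighborhood as the responsibilities: the self-organizing constraint.
        winners = np.argmax(log_p @ H.T, axis=1)    # (n,)
        Q = H[winners]                              # (n, n_units)

        # M-step: standard weighted Gaussian-mixture updates.
        Nk = Q.sum(axis=0) + 1e-12
        mu = (Q.T @ X) / Nk[:, None]
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = (Q * sq).sum(axis=0) / (d * Nk) + 1e-6
        log_pi = np.log(Nk / n)
    return mu, var, np.exp(log_pi)
```

On simple data, adjacent units typically end up with nearby means, which is the self-organizing effect the penalty term is meant to enforce; the paper itself generalizes the construction to any mixture with a standard EM algorithm and handles missing data within the same framework.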
Main file: verbeek03neuro_final.pdf (859.02 KB)
Figure: VVK05.png (52.38 KB)
Origin: Files produced by the author(s)

Dates and versions

inria-00321479, version 1 (16-02-2011)

Identifiers

  • HAL Id: inria-00321479
  • DOI: 10.1016/j.neucom.2004.04.008

Cite

Jakob Verbeek, Nikos Vlassis, Ben Krose. Self-organizing mixture models. Neurocomputing, 2005, 63, pp.99--123. ⟨10.1016/j.neucom.2004.04.008⟩. ⟨inria-00321479⟩