Self-organizing mixture models

Abstract: We present an expectation-maximization (EM) algorithm that yields topology-preserving maps of data based on probabilistic mixture models. Our approach is applicable to any mixture model for which a standard EM algorithm exists. Compared to other mixture-model approaches to self-organizing maps (SOMs), the function our algorithm maximizes has a clear interpretation: it sums the data log-likelihood and a penalty term that enforces self-organization. Our approach allows principled handling of missing data and learning of mixtures of SOMs. We present example applications illustrating our approach for continuous, discrete, and mixed discrete and continuous data.
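
To make the idea behind the abstract concrete, below is a minimal NumPy sketch of one EM-style step for an isotropic Gaussian mixture whose components live on a 1-D grid and whose responsibilities are smoothed by a fixed neighborhood kernel; the smoothing plays the role of the self-organization penalty described above. The function names (neighborhood, em_step), the 1-D grid, the hard winner assignment, and the shared-variance update are illustrative assumptions, not the paper's exact updates (the paper's E-step, for instance, also involves an entropy term).

    import numpy as np

    def neighborhood(K, sigma):
        # Fixed Gaussian neighborhood over a 1-D grid of K units;
        # H[s, k] is the (row-normalized) influence of unit s on component k.
        idx = np.arange(K)
        H = np.exp(-0.5 * (idx[:, None] - idx[None, :]) ** 2 / sigma ** 2)
        return H / H.sum(axis=1, keepdims=True)

    def em_step(X, mu, var, H):
        # One EM-style step for an isotropic Gaussian mixture whose
        # responsibilities are smoothed by H to induce self-organization.
        N, d = X.shape
        # E-step: per-component squared distances and Gaussian log-densities.
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)          # (N, K)
        log_p = -0.5 * (sq / var + d * np.log(2 * np.pi * var))       # (N, K)
        # Mix each component's log-density through the neighborhood, then
        # assign each point to its best unit (hard, SOM-like winner).
        mixed = log_p @ H.T                                           # (N, K)
        winners = mixed.argmax(axis=1)
        # Responsibilities: neighborhood kernel centered on the winner.
        R = H[winners]                                                # (N, K)
        # M-step: weighted mean updates and a shared variance.
        w = R.sum(axis=0) + 1e-12                                     # (K,)
        mu_new = (R.T @ X) / w[:, None]
        sq_new = ((X[:, None, :] - mu_new[None, :, :]) ** 2).sum(-1)
        var_new = (R * sq_new).sum() / (N * d)
        return mu_new, var_new, winners

    # Hypothetical usage: fit 10 grid-ordered units to 2-D synthetic data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))
    K = 10
    mu = X[rng.choice(len(X), K, replace=False)]
    var, H = 1.0, neighborhood(K, sigma=1.5)
    for _ in range(50):
        mu, var, winners = em_step(X, mu, var, H)

Because the responsibilities are always of the form H[winner], components with nearby grid indices receive correlated updates, which is what produces the topology-preserving arrangement of the means.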
Document type: Journal article

Cited literature: 28 references


https://hal.inria.fr/inria-00321479
Contributor: Jakob Verbeek
Submitted on: Wednesday, February 16, 2011 - 4:22:52 PM
Last modification on: Friday, April 12, 2019 - 3:58:23 PM
Long-term archiving on: Tuesday, May 17, 2011 - 2:25:46 AM

Files: verbeek03neuro_final.pdf (produced by the author(s))

Citation

Jakob Verbeek, Nikos Vlassis, Ben Kröse. Self-organizing mixture models. Neurocomputing, Elsevier, 2005, 63, pp. 99-123. ⟨10.1016/j.neucom.2004.04.008⟩. ⟨inria-00321479⟩
