Self-organizing mixture models

Abstract: We present an expectation-maximization (EM) algorithm that yields topology preserving maps of data based on probabilistic mixture models. Our approach is applicable to any mixture model for which we have a normal EM algorithm. Compared to other mixture model approaches to self-organizing maps (SOMs), the function our algorithm maximizes has a clear interpretation: it sums data log-likelihood and a penalty term that enforces self-organization. Our approach allows principled handling of missing data and learning of mixtures of SOMs. We present example applications illustrating our approach for continuous, discrete, and mixed discrete and continuous data.
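The abstract describes an EM algorithm whose objective combines the usual data log-likelihood with a penalty that enforces self-organization on the map. A common way to realize such a penalty in practice is to smooth the E-step responsibilities across the map grid with a neighborhood kernel. The sketch below is illustrative only, not the authors' code: it uses a 1-D grid of spherical Gaussian components, and the function names (`som_em`, `neighborhood`) and the kernel width `sigma` are assumptions for the example.

```python
import numpy as np

def neighborhood(K, sigma):
    """Grid-distance kernel H[k, l] ~ exp(-(k - l)^2 / (2 sigma^2)), rows normalized."""
    d = np.arange(K)[:, None] - np.arange(K)[None, :]
    H = np.exp(-d**2 / (2 * sigma**2))
    return H / H.sum(axis=1, keepdims=True)

def som_em(X, K=10, sigma=2.0, iters=50, seed=0):
    """Illustrative EM for a self-organizing Gaussian mixture (1-D map).

    E-step: standard Gaussian posteriors, then smoothed across the map
    grid by the neighborhood kernel -- this smoothing plays the role of
    the self-organization penalty described in the abstract.
    M-step: the usual weighted mean/variance updates.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, K, replace=False)]     # component means, one per grid node
    var = np.full(K, X.var())                   # spherical variances
    H = neighborhood(K, sigma)
    for _ in range(iters):
        # E-step: log-responsibilities (constants cancel in the softmax)
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)      # (n, K)
        logp = -0.5 * sq / var - 0.5 * d * np.log(var)
        p = np.exp(logp - logp.max(1, keepdims=True))
        p /= p.sum(1, keepdims=True)
        # smooth responsibilities over the map grid (self-organization)
        r = p @ H.T
        r /= r.sum(1, keepdims=True)
        # M-step with the smoothed responsibilities
        w = r.sum(0)                                        # (K,)
        mu = (r.T @ X) / w[:, None]
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * sq).sum(0) / (d * w) + 1e-6
    return mu
```

Because each component's responsibilities are shared with its grid neighbors, components that are adjacent on the map are pulled toward nearby regions of the data, which is what makes the resulting map topology preserving.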
Document type :
Journal articles
Cited literature [28 references]

Contributor: Jakob Verbeek
Submitted on: Wednesday, February 16, 2011 - 4:22:52 PM
Last modification on: Friday, April 12, 2019 - 3:58:23 PM
Long-term archiving on: Tuesday, May 17, 2011 - 2:25:46 AM





Jakob Verbeek, Nikos Vlassis, Ben Kröse. Self-organizing mixture models. Neurocomputing, Elsevier, 2005, 63, pp. 99-123. ⟨10.1016/j.neucom.2004.04.008⟩. ⟨inria-00321479⟩


