A variational EM algorithm for large-scale mixture modeling

Abstract: Mixture densities constitute a rich family of models that can be used in several data mining and machine learning applications, for instance clustering. Although practical algorithms exist for learning such models from data, these algorithms typically do not scale well to large datasets. Our approach, which builds on previous work by other authors, accelerates the EM algorithm for Gaussian mixtures by precomputing and storing sufficient statistics of the data in the nodes of a kd-tree. In contrast to other works, we obtain algorithms that strictly increase a lower bound on the data log-likelihood in every learning step. Experimental results illustrate the validity of our approach.
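The idea summarized in the abstract can be illustrated with a minimal sketch, which is not the paper's actual algorithm or code: each kd-tree node stores the count, sum, and sum of squares of its points, and a coarse EM step assigns one shared responsibility vector per node computed from those statistics. The diagonal covariances, the median splitting rule, and all function and parameter names below (Node, build_leaves, em_step, min_size) are assumptions made only to keep the example short.

    import numpy as np

    class Node:
        """Sufficient statistics of the data points stored in one kd-tree node."""
        def __init__(self, X):
            self.n = X.shape[0]              # number of points in the node
            self.s = X.sum(axis=0)           # sum of points
            self.ss = (X ** 2).sum(axis=0)   # per-dimension sum of squares
            self.mean = self.s / self.n

    def build_leaves(X, min_size=50):
        """Split on the median of the widest dimension until nodes are small;
        only the leaf statistics are kept, not the points themselves."""
        if X.shape[0] <= min_size:
            return [Node(X)]
        d = int(np.argmax(X.max(axis=0) - X.min(axis=0)))
        m = np.median(X[:, d])
        left, right = X[X[:, d] <= m], X[X[:, d] > m]
        if left.shape[0] == 0 or right.shape[0] == 0:
            return [Node(X)]
        return build_leaves(left, min_size) + build_leaves(right, min_size)

    def em_step(nodes, pi, mu, var):
        """One coarse EM step for a diagonal-covariance Gaussian mixture:
        every point in a node shares the responsibility vector computed at
        the node mean, so the step optimizes a lower bound on the data
        log-likelihood rather than the exact EM objective."""
        K, D = mu.shape
        Nk = np.zeros(K)
        Sk = np.zeros((K, D))
        SSk = np.zeros((K, D))
        for node in nodes:
            # log pi_k + log N(node.mean | mu_k, diag(var_k)), per component k
            logp = np.log(pi) - 0.5 * (
                ((node.mean - mu) ** 2) / var + np.log(2 * np.pi * var)
            ).sum(axis=1)
            r = np.exp(logp - logp.max())
            r /= r.sum()                     # shared responsibilities for the node
            Nk += r * node.n
            Sk += np.outer(r, node.s)
            SSk += np.outer(r, node.ss)
        pi = Nk / Nk.sum()
        mu = Sk / Nk[:, None]
        var = SSk / Nk[:, None] - mu ** 2 + 1e-6   # E[x^2] - E[x]^2, kept positive
        return pi, mu, var

In this sketch, iterating em_step over the leaves returned by build_leaves replaces the per-point E-step of standard EM; a finer tree gives a tighter bound at the cost of more nodes per iteration.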
Document type: Conference papers

Cited literature: 14 references


https://hal.inria.fr/inria-00321486
Contributor: Jakob Verbeek
Submitted on: Tuesday, March 8, 2011 - 3:08:17 PM
Last modification on: Monday, September 25, 2017 - 10:08:04 AM
Long-term archiving on: Thursday, June 9, 2011 - 2:46:00 AM

Files

verbeek03asci2.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: inria-00321486, version 2

Citation

Jakob Verbeek, Nikos Vlassis, Jan Nunnink. A variational EM algorithm for large-scale mixture modeling. 9th Annual Conference of the Advanced School for Computing and Imaging (ASCI '03), Jun 2003, Heijen, Netherlands. pp. 136-143. ⟨inria-00321486v2⟩
