Gaussian mixture learning from noisy data

Abstract: We address the problem of learning a Gaussian mixture from a set of noisy data points. Each input point has an associated covariance matrix that can be interpreted as the uncertainty with which this point was observed. We derive an EM algorithm that learns a Gaussian mixture minimizing the Kullback-Leibler divergence to a variable kernel density estimator on the input data. The proposed algorithm performs iterative optimization of a strict bound on the Kullback-Leibler divergence and is provably convergent.
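The report derives its updates from a strict bound on the KL divergence to a variable kernel density estimator built from the noisy points. As an illustration only, the Python sketch below implements a closely related, standard measurement-error EM for a Gaussian mixture in which each observation X[i] carries its own noise covariance S[i]: responsibilities are computed under the convolved densities N(x_i; mu_k, Sigma_k + S_i), and the M-step uses posterior moments of the latent noise-free points. The function name em_noisy_gmm, the initialization, and all numerical choices are assumptions for illustration, not taken from the report.

import numpy as np

def em_noisy_gmm(X, S, K, n_iter=100, seed=0):
    """EM for a Gaussian mixture when each observation X[i] (shape (n, d)) carries
    its own measurement covariance S[i] (shape (n, d, d)). Generic measurement-error
    sketch; not necessarily the exact bound-optimization scheme of the report."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Hypothetical initialization: uniform weights, random data points as means,
    # shared data covariance (plus a small ridge) for all components.
    pi = np.full(K, 1.0 / K)
    mu = X[rng.choice(n, K, replace=False)].copy()
    Sigma = np.tile(np.cov(X.T) + 1e-6 * np.eye(d), (K, 1, 1))

    for _ in range(n_iter):
        # E-step: responsibilities under N(x_i; mu_k, Sigma_k + S_i).
        log_r = np.empty((n, K))
        for k in range(K):
            for i in range(n):
                C = Sigma[k] + S[i]
                diff = X[i] - mu[k]
                _, logdet = np.linalg.slogdet(C)
                quad = diff @ np.linalg.solve(C, diff)
                log_r[i, k] = np.log(pi[k]) - 0.5 * (logdet + quad + d * np.log(2 * np.pi))
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: update weights, means, covariances from posterior moments of
        # the latent noise-free points z_i given x_i and component k.
        for k in range(K):
            Nk = r[:, k].sum()
            pi[k] = Nk / n
            m = np.empty((n, d))
            Cpost = np.empty((n, d, d))
            for i in range(n):
                G = Sigma[k] @ np.linalg.inv(Sigma[k] + S[i])   # gain matrix
                m[i] = mu[k] + G @ (X[i] - mu[k])               # E[z_i | x_i, k]
                Cpost[i] = Sigma[k] - G @ Sigma[k]              # Cov[z_i | x_i, k]
            mu[k] = (r[:, k, None] * m).sum(axis=0) / Nk
            diff = m - mu[k]
            Sigma[k] = (r[:, k, None, None] *
                        (Cpost + diff[:, :, None] * diff[:, None, :])).sum(axis=0) / Nk
    return pi, mu, Sigma

When every S[i] is the zero matrix, these updates reduce to standard EM for a Gaussian mixture; the per-point covariances inflate the component covariances in the E-step and shrink noisy points toward the component means in the M-step.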
Document type: Report


https://hal.inria.fr/inria-00321483
Contributor: Jakob Verbeek
Submitted on: Tuesday, April 5, 2011
Last modification on: Monday, September 25, 2017
Long-term archiving on: Wednesday, July 6, 2011

Files

verbeek04tr.pdf (produced by the author(s))

Identifiers

  • HAL Id: inria-00321483, version 2

Citation

Nikos Vlassis, Jakob Verbeek. Gaussian mixture learning from noisy data. [Technical Report] IAS-UVA-04, 2004, pp.6. ⟨inria-00321483v2⟩
