Journal articles

A Bayesian reassessment of nearest-neighbour classification

Abstract: The k-nearest-neighbour (knn) procedure is a well-known deterministic method used in supervised classification. This article proposes a reassessment of this approach as a statistical technique derived from a proper probabilistic model; in particular, we modify the assessment found in Holmes and Adams, and evaluated by Manocha and Girolami, where the underlying probabilistic model is not completely well defined. Once provided with a clear probabilistic basis for the knn procedure, we derive computational tools for Bayesian inference on the parameters of the corresponding model. In particular, we assess the difficulties inherent in both pseudo-likelihood and path-sampling approximations of an intractable normalizing constant. We implement an exact MCMC sampler based on perfect sampling; when perfect sampling is not available, we use a Gibbs sampling approximation instead. Illustrations of the performance of the corresponding Bayesian classifier are provided for benchmark datasets, demonstrating in particular the limitations of the pseudo-likelihood approximation in this setting.
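For readers unfamiliar with the baseline, the deterministic knn procedure that the article reassesses can be sketched as follows. This is a minimal illustration in plain NumPy (Euclidean distance, majority vote); the function and variable names are illustrative and are not taken from the paper, which builds a probabilistic model on top of this rule rather than using it directly.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Predict a class label for each test point by majority vote
    among its k nearest training points (Euclidean distance)."""
    preds = []
    for x in X_test:
        # squared Euclidean distances from x to every training point
        d2 = np.sum((X_train - x) ** 2, axis=1)
        nearest = np.argsort(d2)[:k]  # indices of the k closest points
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])  # majority vote
    return np.array(preds)

# toy example: two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([[0.05, 0.1], [0.95, 1.05]]), k=3))
```

The Bayesian reassessment replaces this hard vote with a full probability model over the labels, whose normalizing constant is intractable and motivates the pseudo-likelihood, path-sampling, and perfect-sampling tools discussed in the abstract.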

Cited literature: 36 references
Contributor: Jean-Michel Marin
Submitted on: Monday, March 3, 2008 - 1:51:47 PM
Last modification on: Friday, August 5, 2022 - 2:49:41 PM
Long-term archiving on: Friday, November 25, 2016 - 10:56:54 PM





Lionel Cucala, Jean-Michel Marin, Christian Robert, Mike Titterington. A Bayesian reassessment of nearest-neighbour classification. Journal of the American Statistical Association, Taylor & Francis, 2009, 104 (485), pp.263-273. ⟨10.1198/jasa.2009.0125⟩. ⟨inria-00143783v4⟩


