
Voted-Perceptron based Distance Metric Learning in k Nearest Neighbor

Abstract: This thesis is concerned with distance metric learning for kNN classification. We use the k nearest neighbor (kNN) algorithm, a well-known classical algorithm in machine learning. The contribution of this work lies in combining the k nearest neighbor algorithm with Freund and Schapire's voted-perceptron algorithm and Collins' incremental variant of it. The proposed algorithm can handle both linearly separable and non-linearly separable data. During the training phase, a vector is learned for each class in such a way that the k nearest neighbors of an example belong to its class. These vectors are subsequently used for classifying unseen examples. The implementation is done in an incremental setting, so that the inclusion of new examples does not trigger retraining on all of the stored examples, as is the case in batch learning. A user relevance feedback mechanism is also developed to improve the training data. Experiments are carried out on different datasets and the performance is assessed against the state-of-the-art kNN algorithm. Different distance and similarity metrics are used for comparison.
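The abstract describes the approach only at a high level. As an illustration, here is a minimal sketch of how a per-class weight vector could be learned with perceptron-style updates and then used for kNN-style classification. The weighted dot-product similarity, the particular update rule, the averaging step used in place of the voting scheme, and all names (e.g. VotedPerceptronKNN) are assumptions made for illustration, not the report's actual algorithm.

```python
# Illustrative sketch only -- not the algorithm from the report.
# Assumptions: similarity of x to x' under class c is sum_d w_c[d]*x[d]*x'[d];
# a perceptron-style update fires whenever a wrong-class example appears among
# a training point's k nearest neighbours; averaging the weight vectors stands
# in for the voted-perceptron voting scheme.
import numpy as np


class VotedPerceptronKNN:
    def __init__(self, k=3, epochs=5):
        self.k = k
        self.epochs = epochs

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.X_, self.y_ = X, y
        self.classes_ = np.unique(y)
        d = X.shape[1]
        w = {c: np.ones(d) for c in self.classes_}       # current per-class vectors
        w_sum = {c: np.zeros(d) for c in self.classes_}  # running sums (averaging ~ voting)
        n_steps = 0
        for _ in range(self.epochs):
            for i in range(len(X)):
                c = y[i]
                # Weighted dot-product similarity of X[i] to every example under class c.
                sims = (X * (w[c] * X[i])).sum(axis=1)
                sims[i] = -np.inf                        # never count the point itself
                neighbours = np.argsort(sims)[-self.k:]
                wrong = [j for j in neighbours if y[j] != c]
                if wrong:
                    # Perceptron-style correction: raise the similarity of the closest
                    # same-class example relative to a wrong-class neighbour.
                    same_sims = np.where(y == c, sims, -np.inf)
                    target = int(np.argmax(same_sims))
                    if np.isfinite(same_sims[target]):
                        impostor = int(wrong[0])
                        w[c] = w[c] + X[i] * X[target] - X[i] * X[impostor]
                n_steps += 1
                for cls in self.classes_:
                    w_sum[cls] += w[cls]
        self.w_ = {cls: w_sum[cls] / n_steps for cls in self.classes_}
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        preds = []
        for x in X:
            scores = {}
            for c in self.classes_:
                members = self.X_[self.y_ == c]
                sims = (members * (self.w_[c] * x)).sum(axis=1)
                # Score each class by its k most similar training examples.
                scores[c] = np.sort(sims)[-self.k:].sum()
            preds.append(max(scores, key=scores.get))
        return np.array(preds)
```

Usage follows the familiar fit/predict pattern, e.g. `VotedPerceptronKNN(k=3).fit(X_train, y_train).predict(X_test)` on hypothetical arrays `X_train`, `y_train`, `X_test`.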
Document type: Reports

https://hal.inria.fr/hal-00954107
Contributor: Marie-Christine Fauvet
Submitted on: Friday, February 28, 2014 - 4:13:21 PM
Last modification on: Tuesday, December 8, 2020 - 10:42:35 AM

Identifiers

  • HAL Id: hal-00954107, version 1

Citation

Ali Mustafa Qamar. Voted-Perceptron based Distance Metric Learning in k Nearest Neighbor. [Research Report] 2007. ⟨hal-00954107⟩
