
Voted-Perceptron based Distance Metric Learning in k Nearest Neighbor

Abstract: This thesis is related to distance metric learning for kNN classification. We use the k nearest neighbor (kNN) algorithm, a well-known classical algorithm in machine learning. The contribution of this work lies in combining the k nearest neighbor algorithm with Freund and Schapire's voted-perceptron algorithm and Collins' incremental variant of it. The proposed algorithm can work with linearly separable as well as non-linearly separable data. A vector is learned for each class during the training phase in such a way that the k nearest neighbors belong to the same class. These vectors are subsequently used for classifying unseen examples. The implementation is done in the incremental setting, so that the inclusion of new examples does not trigger retraining on all of the stored examples, as is the case in batch learning. A user relevance feedback mechanism is also developed to improve the training data. Experiments are carried out on different datasets and the performance is assessed against the state-of-the-art kNN algorithm. Different distance and similarity metrics are used for comparison.
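The general idea described in the abstract — a per-class weight vector that reshapes the kNN distance, adjusted by perceptron-style updates whenever a training example's neighborhood disagrees with its label — can be sketched as follows. This is a minimal illustrative sketch, not the report's actual implementation: the class `VotedPerceptronKNN`, the update rule, and all parameter values are assumptions for illustration.

```python
import numpy as np

class VotedPerceptronKNN:
    """Illustrative sketch: kNN with one learned weight vector per class.

    The weighted distance under class c is sum_d w_c[d] * (x[d] - y[d])^2.
    Weights are adjusted by perceptron-style updates so that each training
    example's k nearest neighbors tend to share its class. This is a
    simplified stand-in for the method described in the abstract, not
    the author's algorithm.
    """

    def __init__(self, k=3, eta=0.01, epochs=5):
        self.k, self.eta, self.epochs = k, eta, epochs

    def _dist(self, w, x, X):
        # Weighted squared Euclidean distance from x to every row of X.
        return ((X - x) ** 2 * w).sum(axis=1)

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.X_, self.y_ = X, y
        self.classes_ = np.unique(y)
        # One weight vector per class, initialized uniformly.
        self.w_ = {c: np.ones(X.shape[1]) for c in self.classes_}
        for _ in range(self.epochs):
            for i, (x, c) in enumerate(zip(X, y)):
                d = self._dist(self.w_[c], x, X)
                d[i] = np.inf                      # exclude the point itself
                nn = np.argsort(d)[: self.k]
                if np.mean(y[nn] == c) >= 0.5:
                    continue                       # neighborhood already agrees
                # Perceptron-style update: push wrong-class neighbors away,
                # pull same-class neighbors closer, keep weights positive.
                for j in nn:
                    sign = 1.0 if y[j] != c else -1.0
                    self.w_[c] += sign * self.eta * (X[j] - x) ** 2
                np.clip(self.w_[c], 1e-6, None, out=self.w_[c])
        return self

    def predict(self, X):
        preds = []
        for x in np.asarray(X, float):
            # Score each class by the fraction of same-class labels among
            # the k nearest neighbors under that class's own metric.
            best, best_votes = None, -1.0
            for c in self.classes_:
                d = self._dist(self.w_[c], x, self.X_)
                nn = np.argsort(d)[: self.k]
                votes = np.mean(self.y_[nn] == c)
                if votes > best_votes:
                    best, best_votes = c, votes
            preds.append(best)
        return np.array(preds)
```

Because each update touches only the weight vector of the current example's class, new examples can be folded in without retraining on the whole stored set, which is the incremental property the abstract emphasizes. For example, on two well-separated clusters, `VotedPerceptronKNN(k=3).fit(X, y)` classifies held-out points from each cluster correctly.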
Document type: Research Report

Contributor: Marie-Christine Fauvet
Submitted on: Friday, February 28, 2014 - 4:13:21 PM
Last modification on: Thursday, October 21, 2021 - 3:48:25 AM


  • HAL Id: hal-00954107, version 1



Ali Mustafa Qamar. Voted-Perceptron based Distance Metric Learning in k Nearest Neighbor. [Research Report] 2007. ⟨hal-00954107⟩


