
Multi-source shared nearest neighbours for multi-modal image clustering

Abstract: Shared Nearest Neighbours (SNN) techniques are well known to overcome several shortcomings of traditional clustering approaches, notably high dimensionality and metric limitations. However, previous methods were limited to a single information source, whereas such methods appear to be very well suited for heterogeneous data, typically in multi-modal contexts. In this paper, we introduce a new multi-source shared-neighbours scheme applied to multi-modal image clustering. We first extend existing SNN-based similarity measures to the case of multiple sources, and we introduce an original automatic source selection step when building candidate clusters. The key point is that each resulting cluster is built with its own optimal subset of modalities, which improves robustness to noisy or outlier information sources. We evaluate our method on multi-modal image search results clustering and show its effectiveness using both synthetic and real data involving different visual and textual information sources and several datasets from the literature.
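To make the abstract's terminology concrete, the sketch below illustrates the generic SNN idea it builds on: the similarity between two items is the overlap of their k-nearest-neighbour lists, and a multi-source variant can aggregate per-modality overlaps while discarding the worst-agreeing sources. This is only an illustrative sketch of the general principle, not the report's actual measures or its automatic source selection procedure; all function names and the `top_m` parameter are assumptions for this example.

```python
def snn_similarity(neighbors_a, neighbors_b):
    """Classic SNN similarity: size of the overlap of two k-NN lists."""
    return len(set(neighbors_a) & set(neighbors_b))

def multi_source_snn(knn_per_source_a, knn_per_source_b, top_m=None):
    """Toy multi-source aggregation (NOT the report's method): compute a
    per-source SNN score and, if top_m is given, keep only the top_m
    best-agreeing sources -- a crude stand-in for the idea that each
    cluster uses its own subset of modalities, which damps noisy sources."""
    scores = sorted(
        (snn_similarity(a, b)
         for a, b in zip(knn_per_source_a, knn_per_source_b)),
        reverse=True,
    )
    if top_m is not None:
        scores = scores[:top_m]
    return sum(scores) / len(scores)
```

For example, with a visual source whose k-NN lists agree and a noisy textual source whose lists do not, setting `top_m=1` lets the aggregated score reflect only the agreeing modality.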
Document type :
Reports (Research report)
Complete list of metadata

Contributor: Amel Hamzaoui
Submitted on : Monday, July 26, 2010 - 5:35:22 PM
Last modification on : Wednesday, October 26, 2022 - 8:16:19 AM
Long-term archiving on: Thursday, December 1, 2016 - 9:03:39 PM




  • HAL Id : inria-00496170, version 2



Amel Hamzaoui, Alexis Joly, Nozha Boujemaa. Multi-source shared nearest neighbours for multi-modal image clustering. [Research Report] RR-7351, INRIA. 2010. ⟨inria-00496170v2⟩


