
Multi-source shared nearest neighbours for multi-modal image clustering

Abstract: Shared Nearest Neighbours (SNN) techniques are well known to overcome several shortcomings of traditional clustering approaches, notably high dimensionality and metric limitations. However, previous methods were limited to a single information source, whereas such methods appear to be very well suited to heterogeneous data, typically in multi-modal contexts. In this paper, we introduce a new multi-source shared-neighbours scheme applied to multi-modal image clustering. We first extend existing SNN-based similarity measures to the case of multiple sources, and we introduce an original automatic source-selection step used when building candidate clusters. The key point is that each resulting cluster is built with its own optimal subset of modalities, which improves robustness to noisy or outlier information sources. We evaluate our method in the context of multi-modal image search-results clustering and show its effectiveness on both synthetic and real data involving different visual and textual information sources and several datasets from the literature.
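The core idea of the abstract can be illustrated with a minimal sketch. The functions below are an assumption-laden toy version, not the authors' exact formulation: SNN similarity between two items is taken as the overlap of their k-nearest-neighbour lists, and the multi-source extension simply aggregates that overlap over a subset of modalities (the report's automatic source selection would choose this subset per candidate cluster).

```python
# Illustrative sketch of shared-nearest-neighbour (SNN) similarity extended
# to multiple sources. The exact measures in the report differ; this only
# demonstrates the general mechanism.
import numpy as np

def knn_indices(features, k):
    """k nearest neighbours of each row (Euclidean distance, self excluded)."""
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # an item is not its own neighbour
    return np.argsort(d, axis=1)[:, :k]

def snn_similarity(knn, i, j):
    """SNN similarity: size of the overlap between two k-NN lists."""
    return len(set(knn[i]) & set(knn[j]))

def multi_source_snn(knn_per_source, i, j):
    """Aggregate SNN similarity over a chosen subset of sources/modalities."""
    return sum(snn_similarity(knn, i, j) for knn in knn_per_source)
```

With one k-NN table per modality (e.g. visual descriptors and text features), dropping a noisy modality from `knn_per_source` corresponds to the per-cluster source selection the abstract describes.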


https://hal.inria.fr/inria-00496170
Contributor: Amel Hamzaoui
Submitted on: Monday, July 26, 2010 - 5:35:22 PM
Last modification on: Tuesday, June 18, 2019 - 3:06:04 PM
Long-term archiving on: Thursday, December 1, 2016 - 9:03:39 PM

File

RR-7351.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: inria-00496170, version 2

Citation

Amel Hamzaoui, Alexis Joly, Nozha Boujemaa. Multi-source shared nearest neighbours for multi-modal image clustering. [Research Report] RR-7351, INRIA. 2010. ⟨inria-00496170v2⟩
