Multi-source shared nearest neighbours for multi-modal image clustering

  • Authors:
  • Amel Hamzaoui; Alexis Joly; Nozha Boujemaa

  • Affiliations:
  • INRIA-Rocquencourt, team-project IMEDIA, Le Chesnay Cedex, France 78153 (all authors)

  • Venue:
  • Multimedia Tools and Applications
  • Year:
  • 2011


Abstract

Shared Nearest Neighbours (SNN) techniques are well known to overcome several shortcomings of traditional clustering approaches, notably high dimensionality and metric limitations. However, previous methods were limited to a single information source, whereas such methods appear to be very well suited for heterogeneous data, typically in multi-modal contexts. In this paper, we propose a new technique to accelerate the computation of shared neighbours, and we introduce a new multi-source shared-neighbours scheme applied to multi-modal image clustering. We first extend existing SNN-based similarity measures to the case of multiple sources, and we introduce an original automatic source selection step when building candidate clusters. The key point is that each resulting cluster is built with its own optimal subset of modalities, which improves robustness to noisy or outlier information sources. We evaluate our method in the context of multi-modal search result clustering, visual search mining and subspace clustering. Experimental results on both synthetic and real data, involving different information sources and several datasets, show the effectiveness of our method.
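To make the core idea concrete, the following is a minimal, hypothetical sketch of a basic SNN similarity (the size of the intersection of two points' k-nearest-neighbour lists) and a naive multi-source fusion by averaging per-source SNN scores. It is not the authors' implementation: the paper additionally accelerates the shared-neighbour computation and selects an optimal subset of sources per candidate cluster, neither of which is modelled here. The function names and the brute-force k-NN search are illustrative assumptions.

```python
import math

def knn(points, i, k):
    """Indices of the k nearest neighbours of points[i], excluding itself.
    Brute-force Euclidean search, for illustration only."""
    dists = sorted((math.dist(points[i], p), j)
                   for j, p in enumerate(points) if j != i)
    return {j for _, j in dists[:k]}

def snn_similarity(points, i, j, k=3):
    """Basic SNN similarity: number of k-nearest neighbours shared by i and j."""
    return len(knn(points, i, k) & knn(points, j, k))

def multi_source_snn(sources, i, j, k=3):
    """Naive multi-source fusion: average the SNN score over several feature
    spaces (one point list per modality). The paper instead selects an
    optimal subset of sources for each candidate cluster."""
    return sum(snn_similarity(pts, i, j, k) for pts in sources) / len(sources)
```

For points within the same tight cluster the shared-neighbour count is high, while points from different clusters share few or no neighbours, which is what makes SNN scores robust in high-dimensional spaces where raw distances lose contrast.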