Scalable indexing for perceptual data

  • Authors:
  • Arun Qamra; Edward Y. Chang

  • Affiliations:
  • Department of Computer Science, University of California, Santa Barbara; Google Research

  • Venue:
  • MCAM'07: Proceedings of the 2007 International Conference on Multimedia Content Analysis and Mining
  • Year:
  • 2007


Abstract

In recent years, multimedia objects such as images, video, and audio have become increasingly widespread. Many applications require content-based retrieval, and distance measurement is a key component in such scenarios. The nature of multimedia requires perceptual similarity to be captured when computing the distance between objects. Measures such as the Euclidean distance, which use all attributes of a pair of objects, do not perform very well. Instead, distance measures that use partial matches between objects have been found to perform significantly better. This is because two multimedia objects can be considered perceptually similar when they closely match in some respects, even when they are very different in others. Existing distance measures that capture partial similarity have limitations, such as their non-metric nature, which makes scalable indexing challenging. In this paper, we propose the Partial Match Function, a distance measure that performs well for perceptual data and allows efficient indexing.
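To make the contrast concrete, the following is a minimal sketch, not the paper's actual Partial Match Function, of the idea the abstract describes: a full-match measure like Euclidean distance lets every attribute contribute, while a partial-match measure scores similarity using only the best-matching attributes. The function name `partial_match` and the choice of "sum of the k smallest per-attribute differences" are illustrative assumptions.

```python
import math

def euclidean(a, b):
    # Full-match distance: every attribute contributes, so a large
    # mismatch in any one attribute dominates the result.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def partial_match(a, b, k):
    # Hypothetical partial-match distance (illustrative only): sum
    # the k smallest per-attribute differences, ignoring the rest.
    # Objects that closely match in some respects then score as
    # similar even when other attributes differ greatly.
    diffs = sorted(abs(x - y) for x, y in zip(a, b))
    return sum(diffs[:k])

a = [0.1, 0.2, 0.9, 0.8]
b = [0.1, 0.2, 0.1, 0.1]
# Euclidean sees the pair as distant because of the last two
# attributes; matching on the two closest attributes does not.
print(euclidean(a, b), partial_match(a, b, k=2))
```

Note that a measure like this is generally non-metric (dropping attributes can violate the triangle inequality), which is exactly the indexing difficulty the abstract points to.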