Optimizing multimedia retrieval using multimodal fusion and relevance feedback techniques

  • Authors:
  • Apostolos Axenopoulos, Stavroula Manolopoulou, Petros Daras

  • Affiliations:
  • Centre for Research and Technology Hellas, Informatics and Telematics Institute, Thessaloniki, Greece (all authors)

  • Venue:
  • MMM'12: Proceedings of the 18th International Conference on Advances in Multimedia Modeling
  • Year:
  • 2012

Abstract

This paper introduces a novel approach for search and retrieval of multimedia content. The proposed framework retrieves multiple media types simultaneously, namely 3D objects, 2D images and audio files, by utilizing an appropriately modified manifold learning algorithm. This algorithm, based on Laplacian Eigenmaps, maps the mono-modal low-level descriptors of the different modalities into a new low-dimensional multimodal feature space. To accelerate search and retrieval and make the framework suitable for large-scale applications, a new multimedia indexing scheme is adopted. The retrieval accuracy of the proposed method is further improved through relevance feedback, which enables users to refine their queries by marking the retrieved results as relevant or non-relevant. Experiments performed on a multimodal dataset demonstrate the effectiveness and efficiency of the approach. Finally, the proposed framework can be easily extended to incorporate additional heterogeneous modalities.
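
The abstract describes a pipeline of Laplacian-Eigenmaps embedding, nearest-neighbour retrieval in the shared space, and relevance feedback. The sketch below is only an illustration of that general pipeline, not the authors' modified algorithm: it uses scikit-learn's SpectralEmbedding (standard Laplacian Eigenmaps) on randomly generated stand-in descriptors, and a Rocchio-style feedback step as a common substitute for whatever feedback mechanism the paper actually uses. All array shapes, dimensionalities, and parameter values are assumptions.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding      # standard Laplacian Eigenmaps
from sklearn.neighbors import NearestNeighbors

# Hypothetical mono-modal low-level descriptors for the three media types.
rng = np.random.default_rng(0)
desc_3d    = rng.random((200, 256))   # 3D-object descriptors
desc_image = rng.random((300, 128))   # 2D-image descriptors
desc_audio = rng.random((150, 64))    # audio descriptors

def embed(descriptors, n_components=32, n_neighbors=10):
    """Map one modality's descriptors into a low-dimensional space
    with Laplacian Eigenmaps (SpectralEmbedding)."""
    le = SpectralEmbedding(n_components=n_components,
                           affinity="nearest_neighbors",
                           n_neighbors=n_neighbors)
    return le.fit_transform(descriptors)

# Embed each modality into the same number of dimensions and stack them.
# The paper's modified algorithm additionally aligns the modalities into
# one truly multimodal space; that step is not reproduced here.
embedded = np.vstack([embed(d) for d in (desc_3d, desc_image, desc_audio)])

# Simple k-NN retrieval in the shared low-dimensional space.
index = NearestNeighbors(n_neighbors=5).fit(embedded)
query = embedded[0:1]                              # first 3D object as the query
distances, neighbours = index.kneighbors(query)
print("initial results:", neighbours[0])

def rocchio(query, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.25):
    """Rocchio-style relevance feedback: move the query towards results
    marked relevant and away from those marked non-relevant."""
    q = alpha * query
    if len(relevant):
        q = q + beta * relevant.mean(axis=0)
    if len(non_relevant):
        q = q - gamma * non_relevant.mean(axis=0)
    return q

# Suppose the user marks the first two results relevant, the rest non-relevant.
relevant     = embedded[neighbours[0][:2]]
non_relevant = embedded[neighbours[0][2:]]
refined_query = rocchio(query, relevant, non_relevant)
_, refined_neighbours = index.kneighbors(refined_query)
print("refined results:", refined_neighbours[0])
```

In this toy setup the refined query simply shifts within the embedded space; in the paper's framework the same loop would operate on the multimodal embedding produced by the modified Laplacian-Eigenmaps algorithm and be served by the proposed indexing scheme rather than a brute-force k-NN index.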