An interactive framework for image annotation through gaming

  • Authors:
  • Lasantha Seneviratne; Ebroul Izquierdo

  • Affiliations:
  • Queen Mary University of London, London, United Kingdom; Queen Mary University of London, London, United Kingdom

  • Venue:
  • Proceedings of the International Conference on Multimedia Information Retrieval
  • Year:
  • 2010

Abstract

Image indexing is one of the most difficult challenges facing the computer vision community. Addressing this issue, we designed an innovative approach to obtain accurate labels for images by taking into account the social aspects of human-based computation. The proposed approach is highly discriminative compared with an ordinary content-based image retrieval (CBIR) paradigm. It builds on what millions of individual gamers are eager to do: enjoy themselves within a socially competitive environment. This is achieved by focusing the system on the social aspects of the gaming environment, which involves a widely distributed network of human players. Furthermore, the framework integrates a number of algorithms commonly found in image processing and game-theoretic approaches to obtain accurate labels. As a result, the framework is able to assign (or derive) accurate tags for images by eliminating annotations made by less-rational (cheating) players. The performance of the framework was evaluated with a group of 10 game players. The results show that the proposed approach can obtain good annotations from a small number of game players.
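
The abstract does not spell out how player annotations are aggregated or how cheating players are identified. A minimal sketch of the general idea, pooling tags from multiple players and discarding contributions from players whose tags rarely agree with the provisional consensus, might look as follows. All names, the data layout, and the agreement threshold are illustrative assumptions, not the authors' actual algorithm:

```python
from collections import Counter, defaultdict

# Hypothetical illustration only: consensus tagging with cheater filtering.
# player_tags maps player_id -> {image_id: tag}; none of these names come
# from the paper, and the agreement threshold is an arbitrary assumption.
def aggregate_tags(player_tags, agreement_threshold=0.5):
    # First pass: provisional consensus tag per image by simple majority vote.
    votes = defaultdict(Counter)
    for tags in player_tags.values():
        for image_id, tag in tags.items():
            votes[image_id][tag] += 1
    consensus = {img: counts.most_common(1)[0][0] for img, counts in votes.items()}

    # Score each player by how often they agree with the provisional consensus;
    # players below the threshold are treated as less-rational (cheaters).
    trusted = []
    for player, tags in player_tags.items():
        agree = sum(1 for img, tag in tags.items() if consensus.get(img) == tag)
        if tags and agree / len(tags) >= agreement_threshold:
            trusted.append(player)

    # Second pass: recompute the consensus using only trusted players' tags.
    final_votes = defaultdict(Counter)
    for player in trusted:
        for image_id, tag in player_tags[player].items():
            final_votes[image_id][tag] += 1
    return {img: counts.most_common(1)[0][0] for img, counts in final_votes.items()}


if __name__ == "__main__":
    players = {
        "p1": {"img1": "dog", "img2": "car"},
        "p2": {"img1": "dog", "img2": "car"},
        "p3": {"img1": "zzz", "img2": "xxx"},  # low agreement -> filtered out
    }
    print(aggregate_tags(players))  # {'img1': 'dog', 'img2': 'car'}
```

In practice a game-theoretic scheme like the one described in the paper would also weigh players' incentives and behaviour over time, but even this simple two-pass vote shows how a small pool of players can yield usable labels once outliers are removed.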