DanVideo: an MPEG-7 authoring and retrieval system for dance videos

  • Authors:
  • Rajkumar Kannan; Frederic Andres; Christian Guetl

  • Affiliations:
  • Department of Computer Science, Bishop Heber College, Tiruchirappalli, India; National Institute of Informatics, Tokyo, Japan; Institute for Information Systems and Computer Media, TU-Graz, Graz, Austria

  • Venue:
  • Multimedia Tools and Applications
  • Year:
  • 2010

Abstract

Well-annotated dance media is an essential part of a nation's identity, transcending cultural and language barriers. Many dance video archives suffer from authoring and access problems because of the complex spatio-temporal relationships between dancers, expressed through the movements of their body parts and the emotions they convey in a dance. This paper presents DanVideo, a system for semi-automatic authoring of and access to dance archives. DanVideo provides annotation, authoring, and retrieval tools for choreographers, dancers, and students. We demonstrate how dance media can be semantically annotated and how this information can be used to retrieve dance video semantics. In particular, DanVideo offers an MPEG-7-based semi-automatic authoring tool that takes dance video annotations produced by dance experts and generates MPEG-7 metadata. DanVideo also has a search engine that takes users' queries and retrieves dance semantics from metadata organized with a tree-embedding technique, based on the spatial, temporal, and spatio-temporal features of dancers. The search engine also leverages a domain-specific ontology to process knowledge-based queries. We have assessed the dance-video queries and semantic annotations in terms of precision, recall, and fidelity.
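
The abstract does not describe the authoring tool's internals, so the following is only a minimal sketch of the kind of step it mentions: converting one expert annotation of a dancer's movement into MPEG-7-style XML metadata. The function name annotation_to_mpeg7, the example values, and the choice of descriptors (VideoSegment, MediaTime, Semantic) are illustrative assumptions loosely modeled on the MPEG-7 Multimedia Description Schemes, not DanVideo's actual schema.

    # Hypothetical sketch: serializing one dance-expert annotation as
    # MPEG-7-style metadata. Descriptor names are assumptions for
    # illustration, not the schema used by DanVideo.
    import xml.etree.ElementTree as ET

    def annotation_to_mpeg7(dancer, action, body_part, start, duration):
        """Wrap one expert annotation in an MPEG-7-like VideoSegment description."""
        mpeg7 = ET.Element("Mpeg7")
        description = ET.SubElement(mpeg7, "Description")
        segment = ET.SubElement(description, "VideoSegment")

        # Temporal placement of the annotated movement within the video.
        media_time = ET.SubElement(segment, "MediaTime")
        ET.SubElement(media_time, "MediaTimePoint").text = start     # e.g. "T00:01:12"
        ET.SubElement(media_time, "MediaDuration").text = duration   # e.g. "PT4S"

        # Semantic content: which dancer performs which movement with which body part.
        semantic = ET.SubElement(segment, "Semantic")
        agent = ET.SubElement(semantic, "AgentObject")
        ET.SubElement(agent, "Label").text = dancer
        event = ET.SubElement(semantic, "Event")
        ET.SubElement(event, "Label").text = f"{action} ({body_part})"

        ET.indent(mpeg7)  # pretty-print; requires Python 3.9+
        return ET.tostring(mpeg7, encoding="unicode")

    if __name__ == "__main__":
        print(annotation_to_mpeg7("lead dancer", "spin", "arms", "T00:01:12", "PT4S"))

A retrieval component such as the one described could then match queries against the MediaTime and Semantic elements of many such segments; the values shown above are purely hypothetical.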