Example-based event retrieval in video archive using rough set theory and video ontology

  • Authors:
  • Kimiaki Shirahama; Kuniaki Uehara

  • Affiliations:
  • Kobe University, Rokkodai, Nada, Kobe, Japan; Kobe University, Rokkodai, Nada, Kobe, Japan

  • Venue:
  • Proceedings of the Tenth International Workshop on Multimedia Data Mining
  • Year:
  • 2010

Abstract

In this paper, we develop a method for retrieving events of interest in a video archive. To this end, we address the following two issues. First, because of differences in camera techniques, shooting locations and so on, shots of the same event have significantly different features, so they cannot be retrieved by a single retrieval model. We therefore use "rough set theory" to extract multiple classification rules, each of which correctly identifies a subset of the shots of the event. Second, although concepts such as Person, Car and Cityscape are useful for event retrieval, we need to distinguish concepts relevant to an event from irrelevant ones; otherwise, retrieval performance degrades. To select concepts relevant to the event, we organize concepts into a "video ontology", which is a formal and explicit specification of concepts, concept properties and relations among concepts. Experimental results show the effectiveness of both rough set theory and the video ontology.
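
The abstract describes the two mechanisms only at a high level. As a rough illustration of the first one, the sketch below extracts multiple consistent classification rules from binary concept detections, in the spirit of rough-set rule induction. The shot data, labels and greedy covering strategy are assumptions for illustration, not the authors' algorithm; the concept names (Person, Car, Cityscape) are reused from the abstract as placeholders.

    # Minimal sketch (not the authors' implementation): extract several
    # consistent classification rules from binary concept detections.
    # Shots, labels and detection values below are hypothetical.

    # Each shot: a dict of concept detections (1 = detected) plus a label
    # (1 = shot of the target event, 0 = other shot).
    shots = [
        ({"Person": 1, "Car": 1, "Cityscape": 1}, 1),
        ({"Person": 1, "Car": 0, "Cityscape": 1}, 1),
        ({"Person": 0, "Car": 1, "Cityscape": 0}, 1),
        ({"Person": 1, "Car": 0, "Cityscape": 0}, 0),
        ({"Person": 0, "Car": 0, "Cityscape": 1}, 0),
    ]

    def matches(rule, shot):
        """A rule is a list of (concept, value) conditions; all must hold."""
        return all(shot.get(c) == v for c, v in rule)

    def extract_rules(shots):
        """Build one consistent rule per positive shot, then deduplicate.

        Each rule covers a subset of the positive shots and no negative
        shot, mirroring the idea that no single model covers all shots of
        an event, but several specialized rules together can.
        """
        positives = [s for s, y in shots if y == 1]
        negatives = [s for s, y in shots if y == 0]
        rules = set()
        for seed in positives:
            rule, candidates = [], list(seed.items())
            # Greedily add conditions taken from the seed shot until the
            # rule no longer matches any negative shot.
            while any(matches(rule, n) for n in negatives) and candidates:
                best = max(
                    candidates,
                    key=lambda cond: sum(
                        matches(rule, n) and not matches(rule + [cond], n)
                        for n in negatives
                    ),
                )
                candidates.remove(best)
                rule.append(best)
            if not any(matches(rule, n) for n in negatives):
                rules.add(frozenset(rule))
        return rules

    for rule in extract_rules(shots):
        covered = [i for i, (s, y) in enumerate(shots)
                   if y == 1 and matches(rule, s)]
        print(dict(rule), "-> covers positive shots", covered)

For the second mechanism, the following sketch assumes a tiny hypothetical is-a concept hierarchy and keeps only the detected concepts that fall under a concept tied to the event query; the actual video ontology in the paper also specifies concept properties and other relations, which are omitted here.

    # Minimal sketch (assumed hierarchy, not the paper's ontology): select
    # concepts relevant to an event query and discard the rest.
    ontology = {  # parent -> children (hypothetical fragment)
        "Object": ["Vehicle", "Person"],
        "Vehicle": ["Car", "Bus"],
        "Scene": ["Cityscape", "Indoor"],
    }

    def descendants(concept):
        """The concept itself plus every concept below it in the hierarchy."""
        out = {concept}
        for child in ontology.get(concept, []):
            out |= descendants(child)
        return out

    def select_concepts(event_concepts, detected_concepts):
        """Keep only detected concepts lying under a concept tied to the event."""
        relevant = set().union(*(descendants(c) for c in event_concepts))
        return [c for c in detected_concepts if c in relevant]

    # e.g. for a hypothetical "car driving through a city" event:
    print(select_concepts({"Vehicle", "Cityscape"},
                          ["Person", "Car", "Cityscape", "Indoor"]))
    # -> ['Car', 'Cityscape']; Person and Indoor are treated as irrelevant here

In such a pipeline, concept selection via the ontology would presumably run before rule extraction, so that rules are learned only over concepts relevant to the queried event, consistent with the abstract's point that irrelevant concepts degrade retrieval performance.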