Video scene retrieval with interactive genetic algorithm

  • Authors:
  • Hun-Woo Yoo; Sung-Bae Cho

  • Affiliations:
  • Center for Cognitive Science, Yonsei University, Seoul, Korea 120-749; Department of Computer Science, Yonsei University, Seoul, Korea 120-749

  • Venue:
  • Multimedia Tools and Applications
  • Year:
  • 2007

Abstract

This paper proposes a video scene retrieval algorithm based on emotion. First, abrupt and gradual shot boundaries are detected in video clips, each representing a specific story. Then, five video features, "average color histogram," "average brightness," "average edge histogram," "average shot duration," and "gradual change rate," are extracted from each video, and an interactive genetic algorithm is used to map these features to the emotional space the user has in mind. After the videos that contain the corresponding emotion are selected from the initial population, their feature vectors are regarded as chromosomes and genetic crossover is applied to them. The new chromosomes produced by crossover are then compared with the feature vectors of the database videos using a similarity function, and the most similar videos are taken as the next generation. Iterating this process retrieves a population of videos that match the emotion the user has in mind. To show the validity of the proposed method, six emotion categories, "action," "excitement," "suspense," "quietness," "relaxation," and "happiness," are used in the experiments. The method achieves an average retrieval effectiveness of 70% over 300 commercial videos.
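
The generation loop described in the abstract can be illustrated with a short sketch. This is a minimal, hypothetical Python outline and not the authors' implementation: the single-point crossover, the negative-Euclidean-distance similarity, and all names (`crossover`, `similarity`, `next_generation`, `retrieve`, `user_selects`) are illustrative assumptions; the paper's actual feature extraction and similarity function are not reproduced here.

```python
import random
import numpy as np

def crossover(parent_a, parent_b):
    """Single-point crossover of two feature-vector chromosomes (assumed operator)."""
    point = random.randint(1, len(parent_a) - 1)
    return np.concatenate([parent_a[:point], parent_b[point:]])

def similarity(x, y):
    """Similarity as negative Euclidean distance (illustrative choice only)."""
    return -float(np.linalg.norm(x - y))

def next_generation(selected, database, population_size):
    """Apply crossover to the selected chromosomes, then return indices of the
    database videos most similar to the offspring (no duplicates)."""
    offspring = []
    for _ in range(population_size):
        a, b = random.sample(selected, 2)
        offspring.append(crossover(a, b))
    chosen = []
    for child in offspring:
        ranked = sorted(range(len(database)),
                        key=lambda i: similarity(child, database[i]),
                        reverse=True)
        for idx in ranked:
            if idx not in chosen:
                chosen.append(idx)
                break
    return chosen

def retrieve(database, user_selects, population_size=12, generations=5):
    """Interactive loop: the user_selects callback marks which of the shown
    videos carry the target emotion; their feature vectors seed the next generation."""
    population = random.sample(range(len(database)), population_size)
    for _ in range(generations):
        selected = [database[i] for i in user_selects(population)]
        if len(selected) < 2:
            break  # not enough feedback to apply crossover
        population = next_generation(selected, database, population_size)
    return population
```

In this sketch each database entry is a flat vector concatenating the five features, and the user's selections stand in for the emotional feedback that drives the interactive genetic algorithm.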