Using automatic facial expression classification for contents indexing based on the emotional component

  • Authors:
  • Uwe Kowalik; Terumasa Aoki; Hiroshi Yasuda

  • Affiliations:
  • Research Center of Advanced Science and Technology, The University of Tokyo, Tokyo, Japan (all authors)

  • Venue:
  • EUC'06: Proceedings of the 2006 International Conference on Embedded and Ubiquitous Computing
  • Year:
  • 2006


Abstract

Within the last decade, the development of new technologies in the multimedia sector has advanced at a stunning pace. Due to the availability of high-capacity mass storage devices at low cost, private multimedia libraries containing digital video and audio items have recently gained popularity. Although attached meta-data such as title, actors' names, and creation time eases the task of finding preferred content, it is still difficult to locate a specific part of a movie one enjoyed before, since this requires remembering its time code. In this paper we introduce the BROAFERENCE system, which provides a solution to this problem. We propose meta-data creation based on recorded user experience derived from facial expressions, covering joy, sadness, and anger events as well as interest focus data. In the following, we present the system layout and functionality and describe the experiments conducted for system verification.
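The abstract does not specify how the emotion-based meta-data is structured or queried. As a rough illustration of the idea only, the sketch below models time-stamped emotion events recorded while a viewer watches a video, and a lookup that returns the time codes of strong reactions so a remembered scene can be found again. All names here (EmotionEvent, EmotionIndex, the intensity field and its threshold) are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical emotion labels; the paper mentions joy, sadness, and anger events.
EMOTIONS = ("joy", "sadness", "anger")

@dataclass
class EmotionEvent:
    """One classified facial-expression event, time-stamped against the video."""
    time_code: float   # seconds from the start of the video
    emotion: str       # one of EMOTIONS
    intensity: float   # assumed classifier confidence in [0, 1]

@dataclass
class EmotionIndex:
    """Per-video index of viewer emotion events, usable as search meta-data."""
    events: List[EmotionEvent] = field(default_factory=list)

    def add(self, time_code: float, emotion: str, intensity: float) -> None:
        """Record one emotion event emitted by the expression classifier."""
        if emotion not in EMOTIONS:
            raise ValueError(f"unknown emotion label: {emotion}")
        self.events.append(EmotionEvent(time_code, emotion, intensity))

    def find(self, emotion: str, min_intensity: float = 0.5) -> List[float]:
        """Return time codes where the viewer showed `emotion` strongly,
        e.g. to jump back to the scenes one laughed at."""
        return [e.time_code for e in self.events
                if e.emotion == emotion and e.intensity >= min_intensity]

# Usage: index events recorded during viewing, then retrieve the joyful scenes.
index = EmotionIndex()
index.add(125.0, "joy", 0.9)
index.add(610.5, "sadness", 0.7)
print(index.find("joy"))  # -> [125.0]
```

The paper's interest focus data could be attached in the same per-time-code fashion; this sketch omits it for brevity.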