Metadata-driven interactive web video assembly

  • Authors:
  • Rene Kaiser, Michael Hausenblas, Martin Umgeher

  • Affiliations:
  • Institute of Information Systems & Information Management, Joanneum Research Forschungsgesellschaft mbH, Graz, Austria (Kaiser, Hausenblas); Institute for Software Technology, Graz University of Technology, Graz, Austria (Umgeher)

  • Venue:
  • Multimedia Tools and Applications
  • Year:
  • 2009

Abstract

The recent expansion of broadband Internet access has led to an exponential increase in the number of potential consumers of video on the Web. The huge success of video upload websites shows that the online world, with its virtually unlimited possibilities for active user participation, is an ideal complement to traditional consumption-only media such as TV and DVD. It is evident that users are willing to interact with content-providing systems in order to get the content they desire. In parallel to these developments, innovative tools for producing interactive, non-linear audio-visual content are being created. They support the authoring process alongside the management of media and metadata, enabling on-demand assembly of videos based on the consumer's wishes. The quality of such a dynamic video remixing system depends mainly on the expressiveness of the associated metadata. To eliminate the need for manual input as far as possible, we aim to design a system that can automatically and continuously enrich its own media and metadata repositories. Currently, video content remixing is available on the Web mostly in very basic forms: most platforms offer upload and simple modification of content. Although several implementations exist, to the best of our knowledge no solution uses metadata to its full extent to dynamically render a video stream based on consumers' wishes. With the research presented in this paper, we propose a novel concept for interactive video assembly on the Web. In this approach, consumers describe the desired content using a set of domain-specific parameters. Based on the metadata with which the video clips are annotated, the system chooses clips that fit the user's criteria. The selected clips are sequenced in an aesthetically pleasing manner, and the user can interactively influence content selection at any time during playback. We use a practical example to clarify the concept and further outline what it takes to implement such a system.
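
The abstract describes matching annotated clips against consumer-supplied, domain-specific parameters and assembling them into a playback sequence. The following is a minimal, illustrative sketch of that idea only; the `Clip` data model, the `score` and `assemble` functions, and the tag vocabulary are hypothetical assumptions, not the data structures or algorithm of the paper's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    """A video clip annotated with domain-specific metadata (hypothetical model)."""
    clip_id: str
    duration_s: float
    tags: dict = field(default_factory=dict)  # key/value metadata annotations

def score(clip: Clip, preferences: dict) -> int:
    """Count how many of the consumer's preferences the clip's metadata satisfies."""
    return sum(1 for key, value in preferences.items() if clip.tags.get(key) == value)

def assemble(clips: list, preferences: dict, max_duration_s: float) -> list:
    """Greedily select the best-matching clips until the time budget is exhausted."""
    ranked = sorted(clips, key=lambda c: score(c, preferences), reverse=True)
    playlist, remaining = [], max_duration_s
    for clip in ranked:
        if score(clip, preferences) > 0 and clip.duration_s <= remaining:
            playlist.append(clip)
            remaining -= clip.duration_s
    return playlist

# Example: a consumer requests goal scenes featuring player "A".
clips = [
    Clip("c1", 12.0, {"event": "goal", "player": "A"}),
    Clip("c2", 30.0, {"event": "interview", "player": "A"}),
    Clip("c3", 15.0, {"event": "goal", "player": "B"}),
]
print([c.clip_id for c in assemble(clips, {"event": "goal", "player": "A"}, 60.0)])
```

In a real system of the kind described, the preference parameters could be updated during playback so that subsequent clip selection reflects the viewer's interaction, which this static sketch does not model.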