Evaluation of video artifact perception using event-related potentials

  • Authors: Lea Lindemann, Stephan Wenger, Marcus Magnor
  • Affiliations: TU Braunschweig (all authors)

  • Venue: Proceedings of the ACM SIGGRAPH Symposium on Applied Perception in Graphics and Visualization
  • Year: 2011

Abstract

When new computer graphics algorithms for image and video editing, rendering, or compression are developed, the quality of the results has to be evaluated and compared. Since the produced media are usually intended for presentation to an audience, it is important to predict image and video quality as perceived by a human observer. This can be done by applying an image quality metric or by conducting expensive and time-consuming user studies. Statistical image quality metrics typically do not correlate with quality as perceived by a human observer, while more sophisticated algorithms inspired by the human visual system (HVS) often do not generalize to arbitrary images. A drawback of user studies is that perceived image or video quality is filtered by a decision process, which in turn may be influenced by the performed task and the chosen quality scale. To obtain an objective view of (subjectively) perceived image quality, electroencephalography can be used. In this paper we show that artifacts appearing in videos elicit a measurable brain response that can be analyzed using the event-related potentials technique. Since electroencephalography itself requires an elaborate procedure, we aim to find a minimal setup that reduces the time and number of participants needed to conduct a reliable study of image and video quality. As a first step we demonstrate that the reaction to a video with or without an artifact can be identified from a single recorded electroencephalogram by an off-the-shelf support vector machine, trained on a set of previously recorded responses, with a reliability of up to 80%.
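The classification step described above can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the epoch dimensions, the Gaussian "ERP-like" deflection, and the synthetic data are all assumptions made for the sketch, and a linear-kernel SVM from scikit-learn stands in for whatever off-the-shelf classifier the study used.

```python
# Sketch: classify single-trial "EEG epochs" (clean video vs. artifact)
# with an off-the-shelf SVM. All data here is synthetic; channel count,
# epoch length, and the ERP-like bump are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 200, 8, 128  # assumed epoch shape

# Baseline noise for every trial; label 0 = clean video, 1 = artifact.
X = rng.normal(0.0, 1.0, size=(n_trials, n_channels, n_samples))
y = np.repeat([0, 1], n_trials // 2)

# Add a Gaussian ERP-like deflection to the artifact trials only.
t = np.arange(n_samples)
erp = 1.5 * np.exp(-0.5 * ((t - 75) / 10.0) ** 2)
X[y == 1] += erp  # broadcasts over channels

# Flatten each epoch into one feature vector and train a linear SVM.
X_flat = X.reshape(n_trials, -1)
X_tr, X_te, y_tr, y_te = train_test_split(
    X_flat, y, test_size=0.25, stratify=y, random_state=0)
clf = SVC(kernel="linear").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"single-trial classification accuracy: {accuracy:.2f}")
```

With a clearly separable synthetic signal the accuracy will be near perfect; on real single-trial EEG, where the ERP is buried in ongoing activity, figures like the paper's 80% are far more typical.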