Modeling motion visual perception for video quality assessment

  • Authors:
  • Junyong You, Touradj Ebrahimi, Andrew Perkis

  • Affiliations:
  • Norwegian University of Science and Technology, Trondheim, Norway; École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland; Norwegian University of Science and Technology, Trondheim, Norway

  • Venue:
  • MM '11: Proceedings of the 19th ACM International Conference on Multimedia
  • Year:
  • 2011


Abstract

Contrast sensitivity of the Human Visual System (HVS) plays an important role in perceiving visual stimuli and, consequently, has a significant impact on perceived video quality. This paper proposes a visual perception model based on foveated vision and motion perception. The reference and distorted video sequences are processed by the model to generate the stimuli as perceived by the HVS. The perceived difference between the processed sequences is then measured in the spatial and temporal domains, taking visual sensitivity into account. To estimate the perceived video quality, an advanced pooling scheme is proposed based on the visual attention mechanism, the type of eye movement, and the influence of temporal quality variation. Experimental results demonstrate that the proposed metric significantly outperforms state-of-the-art quality models on a combined eye-tracking and subjective video quality assessment data set.
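The abstract does not give the model's equations, but the core idea of weighting errors by foveated contrast sensitivity can be sketched generically. The snippet below is an illustrative, Geisler-Perry-style foveation weighting combined with simple error pooling; it is not the authors' metric, and the function names, viewing-distance parameter, and constants (`alpha`, `e2`, `ct0`) are typical literature values used here only as placeholders.

```python
import numpy as np

def foveation_weights(h, w, gaze, viewing_dist_px=800.0,
                      alpha=0.106, e2=2.3, ct0=1.0 / 64.0):
    """Per-pixel sensitivity weights that peak at the gaze point.

    Illustrative Geisler-Perry-style falloff: the cutoff spatial
    frequency (cycles/degree) decreases with retinal eccentricity.
    Constants are placeholder values, not taken from this paper.
    """
    ys, xs = np.mgrid[0:h, 0:w]
    # eccentricity of each pixel (degrees of visual angle) from the gaze point
    dist_px = np.hypot(xs - gaze[0], ys - gaze[1])
    ecc = np.degrees(np.arctan(dist_px / viewing_dist_px))
    # cutoff frequency falls as eccentricity grows
    fc = e2 * np.log(1.0 / ct0) / (alpha * (ecc + e2))
    return fc / fc.max()  # normalize: weight 1.0 at the fovea

def foveated_mse(ref, dist, gaze, viewing_dist_px=800.0):
    """Squared error pooled with foveation weights (illustrative only)."""
    h, w = ref.shape
    wts = foveation_weights(h, w, gaze, viewing_dist_px)
    return float(np.sum(wts * (ref - dist) ** 2) / np.sum(wts))
```

With this kind of weighting, a distortion placed far from the gaze point contributes less to the pooled score than a uniform MSE would suggest, which is the basic effect a foveated metric exploits.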