Mobile JND: environment adapted perceptual model and mobile video quality enhancement

  • Authors: Jingteng Xue, Chang Wen Chen
  • Affiliations: State University of New York at Buffalo, Buffalo, NY (both authors)
  • Venue: Proceedings of the 3rd Multimedia Systems Conference
  • Year: 2012


Abstract

Design of a quality-of-experience (QoE) optimized mobile video system should consider not only the video content and display specifications but also the fact that mobile devices are used in many different environments and viewing scenarios. For the same device and the same content, the viewer will perceive different visual quality when the viewing environment changes. Current perceptual quality estimation approaches, including the widely adopted just-noticeable-distortion (JND) based models, neglect the significant influence of the surroundings on perception, even though environmental effects on perception have long been supported by psychophysical experiments. This paper proposes a novel viewing-scenario-adapted model that exploits the influence of various viewing conditions, including display size, viewing distance, ambient luminance, and body movement, and applies the proposed model to H.264 video encoding. With the help of the multiple sensors widely equipped on today's handheld devices, the mobile device can dynamically estimate the surrounding conditions. The estimated environment parameters are fed back to the video encoder to generate an encoded video stream that best matches the current scenario, improving bandwidth efficiency and enhancing visual quality for that particular environment. Our subjective experiments demonstrate a significant 30% saving in bit-rate without perceivable quality loss, or obvious improvements in visual quality under the same bandwidth constraint.
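The feedback loop described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual model: the function names (`jnd_scale`, `qp_offset`), the scaling factors, and the mapping to an H.264 quantization-parameter offset are all assumptions made for clarity.

```python
import math

def jnd_scale(viewing_distance_m, display_diag_in, ambient_lux, moving):
    """Toy estimate of a relative JND tolerance (>1 means distortion
    is harder to notice), combining the four viewing conditions the
    paper names: viewing distance, display size, ambient luminance,
    and body movement. Factors are illustrative only."""
    scale = 1.0
    scale *= max(1.0, viewing_distance_m / 0.35)  # farther viewing masks distortion
    scale *= max(1.0, 5.0 / display_diag_in)      # smaller screens mask distortion
    if ambient_lux > 300:                         # bright surroundings mask noise
        scale *= 1.2
    if moving:                                    # body movement masks fine detail
        scale *= 1.3
    return scale

def qp_offset(scale, max_offset=8):
    """Map the JND tolerance to an H.264 QP offset: higher tolerance
    allows coarser quantization and therefore a lower bit-rate.
    A QP increase of 6 roughly doubles the quantizer step size."""
    return min(max_offset, round(6 * math.log2(scale)))
```

For example, a reference scenario (0.35 m viewing distance, 5-inch display, dim room, stationary viewer) yields no offset, while doubling the viewing distance permits a QP offset of 6, i.e. roughly twice the quantizer step and a correspondingly lower bit-rate.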