Towards perceptual quality evaluation of dynamic meshes

  • Authors:
  • Fakhri Torkhani; Kai Wang; Annick Montanvert

  • Affiliations:
  • Gipsa-lab, CNRS UMR, Grenoble, France (all authors)

  • Venue:
  • Proceedings of the ACM SIGGRAPH Symposium on Applied Perception in Graphics and Visualization
  • Year:
  • 2011


Abstract

In practical applications, it is common for a 3D mesh to undergo lossy operations (e.g. simplification, watermarking, compression, noise contamination). Since the end users of 3D meshes are often human beings, it is important to derive metrics that can faithfully assess the perceptual distortions induced by these operations. Such metrics can be used, for instance, to benchmark a family of geometry processing algorithms or to guide the design of new ones. As in image quality assessment, metrics based on mesh geometric distances (e.g. Hausdorff distance and root mean squared error) cannot correctly predict visual quality degradation. Recently, several perceptually-motivated metrics have been proposed (e.g. the mesh structural distortion measure and roughness-based measures) [Lavoué and Corsini 2010]. These perceptual metrics work well on static meshes, but are less effective on dynamic meshes because they may mistakenly evaluate "natural" deformations as having rather low perceptual quality (cf. Section 3). Based on the fact that surface movements in a dynamic mesh sequence are often defined as quasi-isometric deformations (especially in the case of human body and animal animations), we propose in this poster a perceptually-driven mesh quality metric that is capable of distinguishing quasi-isometric deformations from actual visually unpleasant distortions. To the best of our knowledge, such a metric does not exist in the literature.
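To illustrate the key observation behind the abstract (not the authors' actual metric), a quasi-isometric deformation preserves distances on the surface, so the lengths of mesh edges stay nearly unchanged, whereas noise-like distortions alter them even when a vertex-position error such as RMSE is comparable. A minimal sketch of this idea, using simple per-edge relative length change on two meshes with shared connectivity (all function names and the toy data are illustrative assumptions):

```python
import math

def edge_lengths(vertices, edges):
    """Euclidean length of each mesh edge.

    vertices: list of (x, y, z) tuples; edges: list of (i, j) index pairs.
    """
    return [math.dist(vertices[i], vertices[j]) for i, j in edges]

def edge_length_distortion(ref_vertices, def_vertices, edges):
    """Mean relative change in edge length between a reference mesh and a
    deformed/distorted mesh sharing the same connectivity.

    Close to 0 for quasi-isometric deformations (edge lengths preserved);
    clearly positive for noise-like distortions that stretch edges.
    """
    ref = edge_lengths(ref_vertices, edges)
    new = edge_lengths(def_vertices, edges)
    return sum(abs(a - b) / a for a, b in zip(ref, new)) / len(edges)

# Toy triangle mesh (illustrative data).
tri = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
edges = [(0, 1), (1, 2), (2, 0)]

# A rigid 90-degree rotation about z is isometric: edge lengths unchanged.
rotated = [(-y, x, z) for x, y, z in tri]

# Displacing one vertex acts like geometric noise: edges stretch.
noisy = [(0.0, 0.0, 0.0), (1.2, 0.0, 0.0), (0.0, 1.0, 0.0)]

print(edge_length_distortion(tri, rotated, edges))  # ~0.0
print(edge_length_distortion(tri, noisy, edges))    # clearly > 0
```

A metric built on this principle would score the rotated (naturally deformed) mesh as visually faithful while still penalizing the noisy one, which is precisely the distinction that RMSE or Hausdorff distance alone cannot make.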