A multiresolution color model for visual difference prediction

  • Authors:
  • David J. Tolhurst (University of Cambridge); Caterina Ripamonti (University of Cambridge); C. Alejandro Párraga (University of Bristol); P. George Lovell (University of Bristol); Tom Troscianko (University of Bristol)

  • Venue:
  • APGV '05: Proceedings of the 2nd Symposium on Applied Perception in Graphics and Visualization
  • Year:
  • 2005

Abstract

How different are two images when viewed by a human observer? Such knowledge is needed in many situations, for example when judging how closely a graphics rendering resembles a high-quality photograph of the original scene. A class of computational models attempts to predict such perceived differences. These models are derived from theoretical considerations of human vision and are mostly validated by experiments on simple stimuli such as sinusoidal gratings. We are developing a model of visual difference prediction based on multi-scale analysis of local contrast, to be tested with psychophysical discrimination experiments on natural-scene stimuli. Here, we extend our model to account for differences in the chromatic domain. We describe the model, how it was derived, and how we attempt to validate it psychophysically for monochrome and chromatic images.
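The abstract's core idea (predicting perceived differences from multi-scale analysis of local contrast) can be illustrated with a minimal sketch. This is not the authors' actual model: the Gaussian scales, the `(image − local mean) / local mean` contrast definition, and the Minkowski pooling exponent are all illustrative assumptions, and the sketch handles only luminance, not the chromatic channels the paper adds.

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian, truncated at 3 sigma and normalized to sum to 1
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    # separable Gaussian blur with edge padding
    k = gaussian_kernel(sigma)
    radius = len(k) // 2
    pad = np.pad(img, radius, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

def multiscale_contrast(img, sigmas=(1, 2, 4)):
    # band-limited local contrast at each scale:
    # (image - local mean) / local mean, one common contrast definition
    eps = 1e-6
    return [(img - blur(img, s)) / (blur(img, s) + eps) for s in sigmas]

def visual_difference(img_a, img_b, sigmas=(1, 2, 4), p=4):
    # pool per-scale contrast differences with Minkowski summation;
    # the exponent p=4 is an illustrative choice, not the paper's value
    ca = multiscale_contrast(img_a, sigmas)
    cb = multiscale_contrast(img_b, sigmas)
    total = sum(np.sum(np.abs(a - b) ** p) for a, b in zip(ca, cb))
    return float(total ** (1.0 / p))
```

Identical inputs yield a difference of zero, and a localized luminance change produces a positive score that grows with the size of the perturbation, which is the qualitative behavior a visual difference predictor should show.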