Motion blur is a frequent requirement for rendering high-quality animated images. However, it usually demands considerably more computation than rendering images that are not temporally antialiased. In this article we study the influence of high-level properties such as object material and speed, shutter time, and antialiasing level. Based on scenes containing variations of these parameters, we design psychophysical experiments to determine how strongly each one affects perceived image quality. This work provides insight into the effects of these parameters and exposes situations in which motion-blurred stimuli may be indistinguishable from a gold standard. As an immediate practical application, images of similar perceived quality can be produced at reduced computational cost. Traditionally, algorithmic efforts have focused on new and improved methods for alleviating sampling artifacts by steering computation toward the most important dimensions of the rendering equation. Concurrently, rendering algorithms can exploit certain perceptual limits to simplify and optimize computation. To our knowledge, however, no previous work has identified or exploited such limits in the rendering of motion blur. This work can be considered a first step in that direction.
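To make the cost trade-off concrete, a common way to render motion blur is distributed (stochastic) sampling over the shutter interval: each pixel averages several shading samples taken at jittered times while the shutter is open, so cost grows linearly with the number of temporal samples. The sketch below is illustrative only, not the method evaluated in the paper; `shade(t)` is a hypothetical stand-in for evaluating the scene's pixel value at time `t`, and all parameter names are assumptions.

```python
import random

def render_motion_blurred_pixel(shade, t_open, t_close, num_samples):
    """Estimate a motion-blurred pixel value by averaging shading
    samples at stratified, jittered times within the shutter interval
    [t_open, t_close]. `shade(t)` is a hypothetical callable returning
    the pixel value at time t. Cost is linear in num_samples, which is
    why perceptual limits on required sample counts matter.
    """
    total = 0.0
    for i in range(num_samples):
        # One jittered sample per stratum of the shutter interval.
        u = (i + random.random()) / num_samples
        t = t_open + u * (t_close - t_open)
        total += shade(t)
    return total / num_samples

# Toy example: a pixel whose brightness grows linearly over the
# shutter interval averages to roughly the midpoint value.
value = render_motion_blurred_pixel(lambda t: t, 0.0, 1.0, 64)
```

Lowering `num_samples` reduces cost but increases temporal noise; the perceptual experiments described above are aimed at finding when such reductions go unnoticed.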