An empirical pipeline to derive gaze prediction heuristics for 3D action games
ACM Transactions on Applied Perception (TAP)
Predicting the gaze behavior of players can be a highly valuable asset to game designers, enabling them to improve gameplay, selectively increase visual fidelity, and optimize the distribution of computing resources. Saliency maps are currently advocated as the method of choice for predicting visual attention, under the crucial assumption that no specific task is present: they analyze images for low-level features such as motion, contrast, and luminance. Most computer games, however, are designed to be easily understood and pose a task that is readily apparent to players. Our psychophysical experiment shows that in such a task-oriented context, the predictive power of saliency maps computed at design time can be weak. We therefore argue that a more involved protocol, incorporating eye tracking into the computer game design cycle, is needed to predict the fixation behavior of players with sufficient robustness.
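To make the saliency-map idea concrete, the following is a minimal, illustrative sketch of bottom-up saliency from a single low-level feature (luminance center-surround contrast), using only NumPy. It is not the pipeline proposed in the paper, and full models in the style of Itti et al. combine many more channels (color opponency, orientation, motion) across several pyramid scales; the function names and the fine/coarse radii here are illustrative choices.

```python
import numpy as np

def box_blur(img, r):
    """Separable box blur of radius r (edge-padded), via cumulative sums."""
    out = img.astype(float)
    for axis in (0, 1):
        pad = [(r, r) if a == axis else (0, 0) for a in (0, 1)]
        p = np.pad(out, pad, mode="edge")
        c = np.insert(np.cumsum(p, axis=axis), 0, 0.0, axis=axis)
        n = 2 * r + 1  # window length
        if axis == 0:
            out = (c[n:, :] - c[:-n, :]) / n
        else:
            out = (c[:, n:] - c[:, :-n]) / n
    return out

def saliency_map(luma, fine=1, coarse=8):
    """Toy single-channel saliency: |fine-scale - coarse-scale| luminance,
    normalized to [0, 1]. Regions that differ from their surround score high."""
    s = np.abs(box_blur(luma, fine) - box_blur(luma, coarse))
    rng = s.max() - s.min()
    return (s - s.min()) / rng if rng > 0 else s

# A small bright patch on a dark background is the most salient region.
img = np.zeros((32, 32))
img[9:12, 19:22] = 1.0
peak = np.unravel_index(np.argmax(saliency_map(img)), img.shape)
```

Note that such a map is purely stimulus-driven: it scores the bright patch as salient regardless of whether the player's task makes that region relevant, which is exactly the limitation the experiment above probes.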