Modelling virtual camera behaviour through player gaze
Proceedings of the 6th International Conference on Foundations of Digital Games
Automatic camera control aims to provide a framework for controlling virtual camera movements in dynamic and unpredictable virtual environments while guaranteeing a set of desired visual properties. We investigate the relationship between camera placement and playing behaviour in games, and build a user model of camera behaviour that can drive camera movements according to player preferences. For this purpose, we collect eye-gaze, camera and gameplay data from subjects playing a 3D platform game, cluster the gaze and camera information to identify camera-behaviour profiles, and employ machine learning to build predictive models of virtual camera behaviour. The performance of the models on unseen data reveals accuracies above 70% for all the player behaviour types identified. The characteristics of the generated models, their limitations and their use for creating adaptive automatic camera control in games are discussed.
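The pipeline described above (cluster gaze/camera data into behaviour profiles, then predict a profile for new samples) can be sketched in minimal form. This is an illustrative stand-in, not the paper's actual method: the two-dimensional features, the synthetic data, and the nearest-centroid "model" are all assumptions for demonstration; the paper uses richer gaze, camera and gameplay features and trained machine-learning models.

```python
import random
import math

# Hypothetical features per play session: (mean gaze distance from the
# avatar, mean camera height). Synthetic stand-in for real gaze/camera logs.
random.seed(0)
profiles = [(0.2, 1.0), (0.8, 3.0)]  # two assumed camera-behaviour centres
data = [(cx + random.gauss(0, 0.05), cy + random.gauss(0, 0.1))
        for cx, cy in profiles for _ in range(50)]

def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest centre, recompute."""
    centres = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centres[j]))
            clusters[i].append(p)
        centres = [tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl
                   else centres[i] for i, cl in enumerate(clusters)]
    return centres

centres = kmeans(data, k=2)

def predict_profile(sample):
    """Nearest-centroid predictor: map a new sample to a behaviour profile."""
    return min(range(len(centres)), key=lambda j: math.dist(sample, centres[j]))
```

A controller could then query `predict_profile` on a sliding window of recent gaze/camera statistics and switch the active camera profile accordingly.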