This paper introduces a model of gaze behavior suitable for robotic active vision. Built on a saliency map that incorporates motion saliency, the model estimates the dynamics of different eye movements, allowing it to switch among fixational movements, saccades, and smooth pursuit. We investigate the effect of embodying attentive visual selection in a pan-tilt camera system. The physically constrained system cannot follow the rapid fluctuations of the saliency-map maxima, so a strategy is required to dynamically select what is worth attending and which behavior, fixation or target pursuit, to adopt. The main contributions of this work are a novel approach to real-time, motion-based saliency computation in video sequences, a dynamic model of gaze prediction from the saliency map, and the embodiment of the modeled dynamics to control active visual sensing.
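The pipeline described above, motion-based saliency followed by a choice among fixation, saccade, and smooth pursuit, can be sketched as follows. This is a minimal illustration, not the paper's method: the `motion_saliency` function uses a simple frame difference in place of the flow-based saliency the paper computes, and the threshold values in `select_gaze_behavior` are hypothetical.

```python
import numpy as np

def motion_saliency(prev_frame, curr_frame):
    # Absolute temporal difference, normalized to [0, 1] --
    # a crude stand-in for the motion saliency described in the paper.
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    peak = diff.max()
    return diff / peak if peak > 0 else diff

def select_gaze_behavior(saliency, gaze, saccade_dist=10.0, pursuit_level=0.3):
    # Winner-take-all selection of the most salient location, then a
    # behavior decision; both thresholds are illustrative assumptions.
    target = np.unravel_index(np.argmax(saliency), saliency.shape)
    distance = np.hypot(target[0] - gaze[0], target[1] - gaze[1])
    if saliency[target] < pursuit_level:
        return "fixation", gaze        # nothing salient enough: hold gaze
    if distance > saccade_dist:
        return "saccade", target       # distant salient peak: jump to it
    return "smooth_pursuit", target    # nearby salient peak: track it

# Usage: a single bright spot appears far from the current gaze position.
prev = np.zeros((32, 32))
curr = np.zeros((32, 32))
curr[20, 25] = 255.0
sal = motion_saliency(prev, curr)
behavior, target = select_gaze_behavior(sal, gaze=(5, 5))
# behavior == "saccade", target == (20, 25)
```

In an embodied loop, the selected behavior would drive the pan-tilt controller: a saccade issues a fast position command toward the target, while smooth pursuit issues small velocity commands that keep the target near the image center.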