We propose an infotainment presentation system that relies on eye gaze as an intuitive and unobtrusive input modality. The system analyzes eye movements in real time to infer the user's attention, visual interest, and preference regarding interface objects. The application consists of a virtual showroom in which a team of two highly realistic 3D agents presents product items in an entertaining and attractive way. The presentation flow adapts to the user's attentiveness and interest, or lack thereof, and thus provides a more personalized and user-attentive presentation experience.
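The core mechanism described above, inferring visual interest from eye movements and letting the presentation react to it, can be illustrated with a minimal sketch. The abstract does not specify the estimation model, so the following is only an assumed dwell-time heuristic: interest in an interface object accumulates while it is fixated and decays otherwise, and crossing a threshold would be the signal for the agents to adapt the presentation flow. The class name, decay constant, and threshold are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

DECAY = 0.8       # per-sample decay of accumulated interest (assumed value)
THRESHOLD = 2.0   # interest level that would trigger adaptation (assumed value)


@dataclass
class InterestTracker:
    """Toy dwell-time-based interest estimator for a stream of gaze samples.

    Each sample names the interface object currently fixated (or None).
    Interest in an object grows while it is fixated and decays otherwise.
    """
    scores: Dict[str, float] = field(default_factory=dict)

    def update(self, fixated: Optional[str], dt: float = 0.1) -> None:
        """Process one gaze sample covering dt seconds."""
        # All objects lose interest over time...
        for obj in list(self.scores):
            self.scores[obj] *= DECAY
        # ...while the fixated object gains it.
        if fixated is not None:
            self.scores[fixated] = self.scores.get(fixated, 0.0) + dt * 10

    def most_interesting(self) -> Optional[str]:
        """Return the object whose interest exceeds THRESHOLD, if any."""
        if not self.scores:
            return None
        obj, score = max(self.scores.items(), key=lambda kv: kv[1])
        return obj if score >= THRESHOLD else None
```

In a presentation loop, the agents would poll `most_interesting()` after each gaze sample and, for example, switch to describing the product the user keeps looking at, or re-engage the user when no object holds their attention.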