IVA '07 Proceedings of the 7th international conference on Intelligent Virtual Agents
This research proposes 3D graphical agents in the role of virtual presenters with a new type of functionality: the capability to process and respond to users' visual attention as communicated by their eye movements. Eye gaze is an excellent cue to users' attention, visual interest, and visual preference. Using state-of-the-art non-contact eye tracking technology, eye movements can be assessed in an unobtrusive way. By analyzing and interpreting eye behavior in real time, our proposed system can adapt to the current (visual) interest state of the user, and thus provide a more personalized, context-aware, and `attentive' presentation experience. The system implements a virtual presentation room in which research content of our institute is presented by a team of two highly realistic 3D agents in a dynamic and interactive way. A small preliminary study was conducted to investigate users' gaze behavior with a non-interactive version of the system. A demo video based on our system received the award for best application of life-like agents at the GALA event in 2006.
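The abstract describes estimating the user's current visual interest from real-time eye movement data. The paper does not give its algorithm, but a minimal sketch of one common approach is shown below: gaze samples from the tracker are mapped to areas of interest (AOIs, e.g. presentation objects on the virtual stage), attention scores are accumulated per AOI with an exponential decay so that recent fixations dominate, and the most attended AOI is reported as the current interest target. All names (`estimate_interest`, the AOI labels) and the decay scheme are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

def estimate_interest(gaze_samples, sample_interval_ms=20.0, decay=0.9):
    """Estimate the currently most attended area of interest (AOI).

    gaze_samples: sequence of AOI labels (or None when gaze hits no AOI),
    one per eye-tracker sampling interval. Each sample adds its dwell time
    to the hit AOI's score; all scores decay each step, so the winner
    reflects the user's *current* interest rather than total dwell time.
    Returns (best_aoi_or_None, score_dict).
    """
    scores = defaultdict(float)
    for aoi in gaze_samples:
        for key in scores:          # older attention fades away
            scores[key] *= decay
        if aoi is not None:
            scores[aoi] += sample_interval_ms
    if not scores:
        return None, {}
    best = max(scores, key=scores.get)
    return best, dict(scores)

# Hypothetical sample stream: the user mostly fixates the "poster" object.
samples = ["screen", "poster", "poster", None, "poster", "screen"]
best, scores = estimate_interest(samples)
```

An attentive presenter built on such an estimate could, for example, switch to explaining the `poster` exhibit once its score exceeds a threshold, which is the kind of gaze-contingent adaptation the system described here performs.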