This paper proposes a daily-partner robot that becomes aware of the user's situation and behavior through gaze and utterance detection. For appropriate and familiar anthropomorphic interaction, the robot should wait for a suitable moment to speak to the user, depending on his or her situation, while the user is working on a task or thinking. To this end, the proposed robot i) estimates the user's context by detecting his or her gaze and utterances, such as the target of the user's speech, ii) signals its need to speak silently (i.e., without making an utterance) through gaze turns toward the user and joint attention, taking advantage of the user's attentiveness, and iii) delivers its message once the user talks to the robot. The results of experiments combining subjects' daily tasks with and without the above steps show that the robot's crossmodal-aware behaviors are important for respectful communication: silent behaviors that convey the robot's intention to speak can draw the user's attention without disturbing the ongoing task.
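The three-step interaction policy described above can be sketched as a small state machine. This is a hypothetical illustration, not the authors' implementation: the class name `PartnerRobot`, the phase names, and the boolean inputs standing in for the robot's gaze/utterance detectors are all assumptions made for clarity.

```python
from enum import Enum, auto

class Phase(Enum):
    OBSERVE = auto()  # i) estimate the user's context from gaze/utterance
    SIGNAL = auto()   # ii) silent gaze turn toward the user / joint attention
    SPEAK = auto()    # iii) deliver the message once the user addresses the robot

class PartnerRobot:
    """Hypothetical sketch of the paper's three-step interaction policy."""

    def __init__(self, message):
        self.phase = Phase.OBSERVE
        self.message = message

    def step(self, user_gazes_at_robot, user_speaks_to_robot):
        """One sensing/acting cycle; returns the robot's action (or None)."""
        if self.phase is Phase.OBSERVE:
            # Wait until the user is not absorbed in the task; here this is
            # crudely approximated by the user glancing at the robot.
            if user_gazes_at_robot:
                self.phase = Phase.SIGNAL
            return None  # stay silent while the user is busy
        if self.phase is Phase.SIGNAL:
            if user_speaks_to_robot:
                # The user has initiated contact, so speaking is now polite.
                self.phase = Phase.SPEAK
                return self.message
            # Non-verbal cue only: turn gaze toward the user, no utterance.
            return "gaze_turn"
        return None  # message already delivered
```

A usage sequence under these assumptions: the robot stays silent while the user works, emits only silent `"gaze_turn"` cues once the user glances over, and utters its message only after the user speaks to it.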