Gesture recognition is becoming a popular mode of interaction, but it still suffers from significant drawbacks that prevent its integration into everyday devices. One of these drawbacks is the activation of the recognition system -- the trigger gesture -- which is generally tiring and unnatural. In this paper, we propose two natural solutions for easily activating gesture interaction. The first requires a single action from the user: grasping a remote control to start interacting. The second is completely transparent to the user: the gesture system is activated only when the user's gaze points at the screen, i.e. when s/he is looking at it. Our first evaluation of the two proposed solutions, alongside a default implementation, suggests that gaze-based activation is efficient enough to remove the need for a trigger gesture to activate the recognition system.
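The gaze-based activation described above can be sketched as a simple gate in front of the gesture classifier. The sketch below is illustrative only: all names (`GazeGatedRecognizer`, `recognize_gesture`, the normalized gaze coordinates) are assumptions, not the paper's actual implementation.

```python
def gaze_on_screen(gaze_point):
    """True if the estimated gaze point falls on the screen,
    assuming normalized [0, 1] screen coordinates (an assumption)."""
    x, y = gaze_point
    return 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0


def recognize_gesture(frame):
    """Placeholder for the actual gesture classifier (hypothetical)."""
    return "swipe" if frame else None


class GazeGatedRecognizer:
    """Runs the gesture recognizer only while the user looks at the screen,
    so no explicit trigger gesture is needed to activate it."""

    def __init__(self):
        self.active = False

    def update_gaze(self, gaze_point):
        # Activation is transparent: the only "action" is looking at the screen.
        self.active = gaze_on_screen(gaze_point)
        return self.active

    def process(self, frame):
        # Camera frames are ignored while the user's gaze is elsewhere,
        # avoiding accidental activations of the recognizer.
        if not self.active:
            return None
        return recognize_gesture(frame)
```

In practice a real system would add a short dwell time or hysteresis so that brief gaze excursions off-screen do not toggle the recognizer on every frame; this sketch keeps only the core gating idea.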