This paper introduces an interaction method for 3D virtual spaces on tablet devices based on continuous gaze tracking and non-touch gesture recognition. The user can turn the viewpoint and select objects with gaze, and can grab and manipulate objects with non-touch hand gestures; no mouse or keyboard is required. We created a test scenario with an object manipulation task and compared the completion times of the combined gaze-tracking and non-touch gesture method against a touch-screen-only input method. Short interviews were conducted with 13 test subjects, and further data was gathered through questionnaires. The touch-screen method was generally faster than, or as fast as, the combined gaze and non-touch gesture method. The users felt, however, that gaze tracking was more interesting and showed potential, although it would require greater stability to be suitable for use with mobile devices.
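Gaze-based object selection of the kind described above is commonly implemented with a dwell-time threshold: an object is selected once the gaze has rested on it for a minimum duration, which filters out brief glances. A minimal sketch of this idea follows; the class names, the 0.5 s threshold, and the rectangular hit-testing are illustrative assumptions, not details of the paper's actual implementation.

```python
from dataclasses import dataclass

DWELL_TIME = 0.5  # seconds of steady gaze before selection (assumed value)

@dataclass
class SceneObject:
    """A selectable object with a screen-space bounding box (hypothetical)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze point (gx, gy) falls inside this object's box.
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

class DwellSelector:
    """Returns an object once the gaze has dwelt on it for `dwell` seconds."""

    def __init__(self, objects, dwell: float = DWELL_TIME):
        self.objects = objects
        self.dwell = dwell
        self._candidate = None   # object currently under the gaze
        self._since = 0.0        # timestamp when the gaze first landed on it

    def update(self, gx: float, gy: float, t: float):
        """Feed one gaze sample (gx, gy) at time t; return the selected
        object, or None if no dwell threshold has been reached yet."""
        hit = next((o for o in self.objects if o.contains(gx, gy)), None)
        if hit is not self._candidate:
            # Gaze moved to a different object (or to empty space): restart timer.
            self._candidate, self._since = hit, t
            return None
        if hit is not None and t - self._since >= self.dwell:
            return hit  # dwell threshold reached: object selected
        return None

# Usage: feed gaze samples; the cube is selected after 0.5 s of fixation.
selector = DwellSelector([SceneObject("cube", 100, 100, 50, 50)])
selector.update(120, 120, 0.0)            # gaze lands on the cube
picked = selector.update(125, 118, 0.6)   # still on the cube after 0.6 s
print(picked.name)
```

In a real system the selected object would then be handed to the gesture-recognition pipeline for grabbing and manipulation; small gaze jitter is usually absorbed by the bounding box or by an additional smoothing filter.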