Cross-device eye-based interaction
Proceedings of the adjunct publication of the 26th annual ACM symposium on User interface software and technology
Situated public displays and interactive surfaces are becoming ubiquitous in our daily lives. Interaction with these devices becomes problematic over a distance or when content is physically out of reach. In this paper we outline three techniques that combine gaze with manual hand-controlled input to move objects. We demonstrate and discuss how these techniques could be applied in two scenarios: (1) a multi-touch surface, and (2) a public display paired with a mobile device.