Eye-tracking technology is envisaged to become part of our daily lives as it grows increasingly wearable. At the same time, we are surrounded by a wealth of digital content, either close at hand on our personal devices or out of reach on public displays. This work combines gaze with mobile input modalities to enable the transfer of content between public displays and close-proximity personal displays. It contributes enabling technologies and novel interaction techniques, and poses broader questions that move toward a formalisation of this design space, with the aim of developing guidelines for future cross-device eye-based interaction methods.
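The core idea — gaze selects out-of-reach content on a public display, while input on the personal device confirms the transfer — can be illustrated with a minimal sketch. All names here (`GazeSample`, `transfer_on_touch`, the display and item identifiers) are hypothetical and not part of any system described in the abstract; this is only one plausible way such a gaze-plus-touch trigger could be structured.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    """One hypothetical gaze estimate mapped onto a display."""
    display_id: str            # which display the gaze falls on
    item_id: Optional[str]     # content item under the gaze point, if any

def transfer_on_touch(gaze: GazeSample, touch_down: bool,
                      device_id: str) -> Optional[dict]:
    """Combine gaze (which remote item is looked at) with a touch on the
    personal device (the confirmation) into a single transfer action.
    Returns a transfer descriptor, or None if no transfer is triggered."""
    if touch_down and gaze.item_id is not None:
        return {"item": gaze.item_id,
                "source": gaze.display_id,
                "target": device_id}
    return None

# Looking at an item on a public display and touching the phone
# triggers a transfer; touching while gazing at empty space does not.
hit = transfer_on_touch(GazeSample("public-1", "photo-42"), True, "phone-A")
miss = transfer_on_touch(GazeSample("public-1", None), True, "phone-A")
```

The design choice this sketch reflects is the division of labour common to gaze-supported interaction: gaze is fast but imprecise and suffers from the "Midas touch" problem, so it only designates the target, while an explicit manual action commits the operation.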