The goal of the project "Gaze Controlled Interaction with Peripheral Devices" was to extend the head-based eye tracking system DIKABLIS to detect, in real time, whether the gaze falls within previously defined Areas of Interest (AOIs). This allows various events or commands to be triggered when a test person wearing the head unit directs their gaze into an AOI. The commands can be used to interact with different devices; the tool for monitoring and analyzing gaze behavior thus becomes an interaction medium. With such gaze control, multi-modal interaction concepts could be realized. The project's primary aim was to give people with tetraplegia a means of controlling devices in their home. The experimental set-up was a TV set that can be controlled by gaze.
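The core mechanism described above, mapping a real-time gaze point to a predefined AOI and dispatching the command bound to it, can be sketched as follows. This is a minimal illustration, not the DIKABLIS API; the AOI names, coordinates, and TV commands are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AOI:
    """A named, axis-aligned Area of Interest in gaze-coordinate space (hypothetical layout)."""
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, gx: float, gy: float) -> bool:
        # Point-in-rectangle test for the current gaze sample.
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)

def dispatch_gaze(gx: float, gy: float, aois, commands):
    """Return the command bound to the first AOI containing the gaze point, or None."""
    for aoi in aois:
        if aoi.contains(gx, gy):
            return commands.get(aoi.name)
    return None

# Illustrative TV-control AOIs; a real set-up would calibrate these regions
# to the tracked scene and likely require a dwell time before triggering.
aois = [
    AOI("volume_up", 0, 0, 100, 100),
    AOI("channel_next", 120, 0, 100, 100),
]
commands = {"volume_up": "VOL+", "channel_next": "CH+"}

print(dispatch_gaze(30, 40, aois, commands))    # gaze inside "volume_up" -> VOL+
print(dispatch_gaze(500, 500, aois, commands))  # gaze outside all AOIs -> None
```

In practice a dwell-time threshold (holding the gaze inside an AOI for some hundreds of milliseconds) would be layered on top of this test to avoid triggering commands on every passing glance.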