The present aim was to investigate the functionality of a new wireless prototype called Face Interface. The prototype combines voluntary gaze direction and facial muscle activations for pointing at and selecting objects on a computer screen, respectively. The subjective and objective functionality of the prototype was evaluated with a series of pointing tasks using either frowning (i.e., the frowning technique) or raising the eyebrows (i.e., the raising technique) as the selection technique. Pointing task times and accuracies were measured using three target diameters (25, 30, and 40 mm), seven pointing distances (60, 120, 180, 240, 260, 450, and 520 mm), and eight pointing angles (0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315°). The results showed that the raising technique was a faster selection technique than the frowning technique for objects presented at pointing distances from 60 mm to 260 mm. For those pointing distances, the overall pointing task times were 2.4 s for the frowning technique and 1.6 s for the raising technique. Fitts' law computations showed correlations of r = 0.77 for the frowning technique and r = 0.51 for the raising technique. Further, the index of performance (IP) was 1.9 bits/s for the frowning technique and 5.4 bits/s for the raising technique. Based on the results, the prototype functioned well and was adjustable, so that two different facial activations could be used in combination with gaze direction for pointing at and selecting objects on a computer screen.
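The Fitts' law quantities reported above can be illustrated with a minimal sketch. This assumes the common Shannon formulation of the index of difficulty, ID = log2(D/W + 1), and computes throughput as ID divided by movement time; the study itself may have derived IP from a regression over all conditions, so the function names and the example numbers below are illustrative, not the authors' actual computation.

```python
import math

def index_of_difficulty(distance_mm, width_mm):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance_mm / width_mm + 1)

def throughput(id_bits, movement_time_s):
    """Simple throughput estimate (bits/s) for one condition."""
    return id_bits / movement_time_s

# Target diameters and pointing distances used in the study (mm).
widths = [25, 30, 40]
distances = [60, 120, 180, 240, 260, 450, 520]

# Illustrative condition: a 260 mm movement to a 30 mm target,
# completed in 1.6 s (the mean time reported for the raising technique).
id_example = index_of_difficulty(260, 30)  # ≈ 3.27 bits
tp_example = throughput(id_example, 1.6)   # ≈ 2.05 bits/s
```

A regression of movement time on ID across all width/distance pairs would yield the slope whose reciprocal is the IP value the abstract reports.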