“Put-that-there”: Voice and gesture at the graphics interface
SIGGRAPH '80 Proceedings of the 7th annual conference on Computer graphics and interactive techniques
What you look at is what you get: eye movement-based interaction techniques
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Tailor: creating custom user interfaces based on gesture
UIST '90 Proceedings of the 3rd annual ACM SIGGRAPH symposium on User interface software and technology
The use of eye movements in human-computer interaction techniques: what you look at is what you get
ACM Transactions on Information Systems (TOIS) - Special issue on computer-human interaction
Manual and gaze input cascaded (MAGIC) pointing
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Interacting with eye movements in virtual environments
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Evaluation of eye gaze interaction
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
The reading assistant: eye gaze triggered auditory prompting for reading remediation
UIST '00 Proceedings of the 13th annual ACM symposium on User interface software and technology
Chinese input with keyboard and eye-tracking: an anatomical study
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Multimodal human discourse: gesture and speech
ACM Transactions on Computer-Human Interaction (TOCHI)
Eye-R, a glasses-mounted eye motion detection interface
CHI '01 Extended Abstracts on Human Factors in Computing Systems
What's in the eyes for attentive input
Communications of the ACM
EyeDraw: a system for drawing pictures with the eyes
CHI '04 Extended Abstracts on Human Factors in Computing Systems
ACM SIGGRAPH 2004 Papers
EyeDraw: a system for drawing pictures with eye movements
Assets '04 Proceedings of the 6th international ACM SIGACCESS conference on Computers and accessibility
EyeDraw: enabling children with severe motor impairments to draw with their eyes
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Conversing with the user based on eye-gaze patterns
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
A widget library for gaze-based interaction elements
Proceedings of the 2006 symposium on Eye tracking research & applications
Corneal Imaging System: Environment from Eyes
International Journal of Computer Vision
The catchment feature model: a device for multimodal fusion and a bridge between signal and sense
EURASIP Journal on Applied Signal Processing
Improving eye cursor's stability for eye pointing tasks
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
BC(eye): Combining Eye-Gaze Input with Brain-Computer Interaction
UAHCI '09 Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part II: Intelligent and Ubiquitous Interaction Environments
KIBITZER: a wearable system for eye-gaze-based mobile urban exploration
Proceedings of the 1st Augmented Human International Conference
Mobile gaze-based screen interaction in 3D environments
Proceedings of the 1st Conference on Novel Gaze-Controlled Applications
Recognition of human's implicit intention based on an eyeball movement pattern analysis
ICONIP'11 Proceedings of the 18th international conference on Neural Information Processing - Volume Part I
Citeology: visualizing paper genealogy
CHI '12 Extended Abstracts on Human Factors in Computing Systems
Study of Polynomial Mapping Functions in Video-Oculography Eye Trackers
ACM Transactions on Computer-Human Interaction (TOCHI)
Hybrid method based on topography for robust detection of iris center and eye corners
ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP)
Decision Prediction Using Visual Patterns
Fundamenta Informaticae - To Andrzej Skowron on His 70th Birthday
There is little dispute that the main channels through which people communicate with the world at large are sight, sound, and touch, and with other people, eye contact, speech, and gesture. Advanced human-computer interfaces increasingly incorporate speech input and output, along with touch or some other form of manual input.