We present EyePliances: appliances and devices that detect and respond to human visual attention using eye contact sensors. EyePliances receive implicit input from users, in the form of eye gaze, and respond by opening communication channels. Because devices recognize the attentional cues people already provide, the need for explicit user input is reduced. Further, eye contact sensing gives devices a mechanism for determining whether a user is available for interruption, and supplies the environmental context that speech recognition otherwise lacks.
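The interaction model described above can be sketched in code: a device consults an eye contact signal both to decide whether spoken input is addressed to it and to decide whether it may interrupt the user. This is a minimal illustrative sketch, not the authors' implementation; the class and method names (`EyeContactSensor`, `EyePliance`, `handle_speech`) are hypothetical, and the sensor is modeled as a simple boolean flag standing in for real eye contact sensing hardware.

```python
class EyeContactSensor:
    """Stand-in for an eye contact sensor (hypothetical API).

    A real sensor would report gaze detected by a camera; here the
    state is set manually for illustration.
    """

    def __init__(self):
        self._contact = False

    def set_contact(self, contact: bool) -> None:
        self._contact = contact

    def has_eye_contact(self) -> bool:
        return self._contact


class EyePliance:
    """Appliance that opens a communication channel only while it
    holds the user's visual attention."""

    def __init__(self, name: str, sensor: EyeContactSensor):
        self.name = name
        self.sensor = sensor

    def handle_speech(self, command: str) -> str:
        # Eye contact supplies the missing context for speech
        # recognition: a spoken command is treated as addressed to
        # this device only if the user is looking at it.
        if self.sensor.has_eye_contact():
            return f"{self.name}: executing '{command}'"
        return f"{self.name}: ignoring '{command}' (no eye contact)"

    def may_interrupt(self) -> bool:
        # Gate notifications on attention: defer interruptions
        # while the user is not looking at the device.
        return self.sensor.has_eye_contact()


if __name__ == "__main__":
    sensor = EyeContactSensor()
    lamp = EyePliance("lamp", sensor)

    print(lamp.handle_speech("turn on"))   # no eye contact: ignored
    sensor.set_contact(True)
    print(lamp.handle_speech("turn on"))   # eye contact: executed
    print(lamp.may_interrupt())            # True while gaze is held
```

The same boolean signal serves both roles the abstract names: disambiguating the addressee of speech and signaling availability for interruption.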