To make human-computer interaction more transparent, different modes of communication need to be explored. We present eyeCOOK, a multimodal attentive cookbook that helps a non-expert computer user cook a meal. The user communicates through eye gaze and speech commands, and eyeCOOK responds visually and/or verbally, supporting natural human input channels without physically encumbering the user. Our goal is to improve productivity and user satisfaction without placing additional demands on the user's attention. We describe how the user interacts with the eyeCOOK prototype and the role of this system in an Attentive Kitchen.
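One way to picture the gaze-plus-speech interaction described above is deictic reference resolution: a spoken command such as "read that" is bound to whatever on-screen element the user was most recently looking at. The sketch below is purely illustrative and assumes nothing about eyeCOOK's actual implementation; the `GazeEvent` class and `resolve_command` function are hypothetical names introduced here for exposition.

```python
from dataclasses import dataclass

@dataclass
class GazeEvent:
    target: str       # UI region the user was looking at, e.g. "ingredients"
    timestamp: float  # seconds; larger means more recent

def resolve_command(speech: str, recent_gaze: list) -> str:
    """Bind a spoken command to the most recent gaze target (a hypothetical
    fusion strategy, not eyeCOOK's documented behavior)."""
    if not recent_gaze:
        return f"{speech} -> no gaze target"
    target = max(recent_gaze, key=lambda g: g.timestamp).target
    return f"{speech} -> {target}"

# The spoken "read that" resolves against the timer, the most recent fixation.
print(resolve_command("read that",
                      [GazeEvent("step 3", 1.0), GazeEvent("timer", 2.5)]))
```

The design point this sketch makes is that neither channel alone suffices: speech carries the verb, gaze carries the object, and fusing them lets the user issue commands without touching the screen with messy hands.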