Hands on cooking: towards an attentive kitchen

  • Authors:
  • Jeremy S. Bradbury, Jeffrey S. Shell, Craig B. Knowles

  • Affiliations:
  • Queen's University, Kingston, ON, Canada (all authors)

  • Venue:
  • CHI '03 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2003

Abstract

To make human-computer interaction more transparent, different modes of communication need to be explored. We present eyeCOOK, a multimodal attentive cookbook that helps a non-expert computer user cook a meal. The user communicates using eye-gaze and speech commands, and eyeCOOK responds visually and/or verbally, promoting communication through natural human input channels without physically encumbering the user. Our goal is to improve productivity and user satisfaction without creating additional demands on user attention. We describe how the user interacts with the eyeCOOK prototype and the role of this system in an Attentive Kitchen.
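The abstract describes pairing eye-gaze with speech so the system can resolve what a spoken command refers to. The paper does not give implementation details, so the sketch below is purely illustrative: the event classes, field names, and the time-window fusion rule are all assumptions, showing one plausible way a gaze target could be attached to a nearby spoken command.

```python
from dataclasses import dataclass

# Hypothetical sketch of gaze/speech fusion for an eyeCOOK-style system.
# All names and the 2-second window are illustrative assumptions, not
# taken from the paper.

@dataclass
class GazeEvent:
    target: str       # e.g. "recipe_step_3" or "kitchen_timer"
    timestamp: float  # seconds

@dataclass
class SpeechEvent:
    command: str      # e.g. "read", "next", "start"
    timestamp: float  # seconds

def fuse(gaze: GazeEvent, speech: SpeechEvent, window: float = 2.0):
    """Attach the gaze target to a spoken command if the two events
    occurred within `window` seconds of each other; otherwise the
    command is returned without gaze context."""
    if abs(speech.timestamp - gaze.timestamp) <= window:
        return (speech.command, gaze.target)
    return (speech.command, None)
```

For example, saying "start" shortly after looking at the timer would yield `("start", "kitchen_timer")`, letting the system act on the object of attention rather than requiring an explicit verbal reference.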