Depth cameras have become a fixture of millions of living rooms thanks to the Microsoft Kinect. Whether they can succeed as widely in other areas of the home remains to be seen. This research takes the Kinect into real-life kitchens, where touchless gestural control could be a boon for messy hands, but where commands are interspersed with the movements of cooking. We implement a recipe navigator, a timer, and a music player and, experimentally, allow users to change the control scheme at runtime and to navigate with other limbs when their hands are full. We tested our system with five subjects who baked a cookie recipe in their own kitchens, and found that placing the Kinect was simple and that subjects felt successful. However, testing in real kitchens underscored the challenge of preventing accidental commands in tasks with sporadic input.
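The abstract does not describe how accidental commands might be suppressed, but a common technique for intermittent gestural input is a dwell-time gate: a tracked hand must remain inside a command region for a minimum duration before the command fires, so brief cooking motions that pass through the region are ignored. The sketch below is a hypothetical illustration of that idea, not the paper's implementation; the class name `DwellGate`, the 1-second threshold, and the region names are all assumptions for the example.

```python
# Hypothetical sketch (not the paper's implementation): suppress accidental
# commands with a dwell-time gate. A hand must stay in a command region for
# a minimum duration before the command fires.

DWELL_SECONDS = 1.0  # assumed threshold; sporadic cooking motions are briefer


class DwellGate:
    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.region = None   # command region the hand currently occupies
        self.entered = None  # timestamp when the hand entered that region

    def update(self, region, t):
        """Feed one tracking frame: the region the hand is in (or None)
        and the frame timestamp in seconds. Returns the command name
        once per dwell, or None."""
        if region != self.region:
            # Hand moved to a different region (or left all regions):
            # restart the dwell clock.
            self.region, self.entered = region, t
            return None
        if region is not None and t - self.entered >= self.dwell:
            self.entered = float("inf")  # fire once; require re-entry to repeat
            return region
        return None


gate = DwellGate()
# A brief 0.4 s pass over "next_step" does not fire a command...
assert gate.update("next_step", 0.0) is None
assert gate.update(None, 0.4) is None
# ...but holding over the region for at least 1 s does, exactly once.
fired = [gate.update("next_step", 1.0), gate.update("next_step", 2.1)]
```

A real system would drive `update` from per-frame hand positions reported by the depth camera's skeleton tracker; the gate itself only needs the region hit-test result and a timestamp.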