MimiCook: a cooking assistant system with situated guidance

  • Authors:
  • Ayaka Sato; Keita Watanabe; Jun Rekimoto

  • Affiliations:
  • The University of Tokyo, Hongo, Bunkyo, Tokyo, Japan; Meiji University, Nakano-ku, Tokyo, Japan; The University of Tokyo, Hongo, Bunkyo, Tokyo, Japan

  • Venue:
  • Proceedings of the 8th International Conference on Tangible, Embedded and Embodied Interaction
  • Year:
  • 2014

Abstract

Referring to documents while making things is common, but the gap between a written description and the actual context of making causes difficulties. For example, when cooking from a recipe, people may lose their place in the recipe, misread the required amount of an ingredient because of complicated measuring units, or skip steps by mistake. We address these problems with cooking as our domain. Our proposed cooking support system, MimiCook, embodies a recipe on a real kitchen counter and guides the user directly. The system consists of a computer, a depth camera, a projector, and a scale. It projects step-by-step instructions directly onto the utensils and ingredients and adapts the guidance display to the user's current situation. The integrated scale also helps users avoid mistakes with measuring units. Results of our user study show that participants found it easier to cook with the system, and even participants who had never cooked the assigned recipe made no mistakes.
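
The abstract describes gating recipe progress on readings from the integrated scale so that amounts are checked in a single unit. The sketch below is a hypothetical illustration, not the authors' implementation: it assumes recipe steps expressed as cumulative target weights in grams and a simulated scale feed (the names Step, run_recipe, and the sample readings are invented for illustration), and advances to the next projected instruction only once the target weight is reached within a tolerance.

    from dataclasses import dataclass

    @dataclass
    class Step:
        instruction: str      # text that would be projected next to the utensil/ingredient
        target_grams: float   # desired cumulative weight on the scale
        tolerance: float = 2.0

    def read_scale_grams(readings):
        """Stand-in for a real scale driver; yields simulated weight readings."""
        yield from readings

    def run_recipe(steps, scale_readings):
        """Advance through recipe steps as the measured weight reaches each target."""
        scale = read_scale_grams(scale_readings)
        for step in steps:
            print(f"PROJECT: {step.instruction} (target {step.target_grams} g)")
            for grams in scale:
                if abs(grams - step.target_grams) <= step.tolerance:
                    print(f"  reached {grams} g -> next step")
                    break

    if __name__ == "__main__":
        recipe = [
            Step("Add flour to the bowl", 200.0),
            Step("Add sugar to the bowl", 250.0),
            Step("Add milk to the bowl", 370.0),
        ]
        simulated_readings = [50, 120, 199, 230, 249, 300, 350, 369]
        run_recipe(recipe, simulated_readings)

Expressing every amount as a cumulative weight in grams is one plausible way to sidestep the measuring-unit confusion the abstract mentions; the actual system may check amounts differently.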