Gesture identification based on zone entry and axis crossing

  • Authors:
  • Ryosuke Aoki, Yutaka Karatsu, Masayuki Ihara, Atsuhiko Maeda, Minoru Kobayashi, Shingo Kagami

  • Affiliations:
  • NTT Cyber Solutions Laboratories, NTT Corporation, Kanagawa, Japan (Ryosuke Aoki, Masayuki Ihara, Atsuhiko Maeda, Minoru Kobayashi); Graduate School of Media and Governance, Keio University, Kanagawa, Japan (Yutaka Karatsu); Graduate School of Information Sciences, Tohoku University, Sendai, Japan (Shingo Kagami)

  • Venue:
  • HCII'11: Proceedings of the 14th International Conference on Human-Computer Interaction: Interaction Techniques and Environments - Volume Part II
  • Year:
  • 2011

Abstract

Hand gesture interfaces have been proposed as an alternative to the remote controller, and products with such interfaces have appeared on the market. We propose the vision-based unicursal gesture interface (VUGI) as an extension of our unicursal gesture interface (UGI) for TV remotes with touchpads. Since UGI allows users to select items from a hierarchical menu comfortably, VUGI is expected to yield similarly easy-to-use hierarchical menu selection. Moreover, in-air gestures such as those of VUGI offer an input area larger than that provided by touchpads. Unfortunately, because the user loses track of his/her finger position in the air, it is not easy to input commands continuously with VUGI. To solve this problem, we propose a dynamic detection zone and detection axes. An experiment confirms that subjects can input VUGI commands continuously.
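
The abstract names the two mechanisms (a dynamic detection zone and detection axes) without detailing them. The sketch below is one hedged reading of gesture identification by zone entry and axis crossing, not the authors' implementation: a fingertip tracked in 2D triggers a directional command when it leaves a detection zone, i.e. crosses one of the zone's boundary axes, and the zone is then re-centered on the fingertip so the next command can be entered continuously. The class name, zone size, and re-centering rule are illustrative assumptions.

```python
# Hedged sketch (assumptions, not the paper's implementation): identify
# directional gestures from a stream of 2D fingertip positions by detecting
# when the fingertip exits a detection zone, i.e. crosses one of the zone's
# boundary axes. The zone is "dynamic": it re-centers after each gesture.

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class ZoneAxisDetector:
    half_size: float = 40.0  # half-width of the square detection zone (pixels), assumed value
    center: Optional[Tuple[float, float]] = None

    def update(self, x: float, y: float) -> Optional[str]:
        """Feed one fingertip sample; return a gesture label when an axis is crossed."""
        if self.center is None:
            self.center = (x, y)  # initialize the zone around the first sample
            return None
        dx, dy = x - self.center[0], y - self.center[1]
        # Still inside the zone: no gesture yet.
        if abs(dx) <= self.half_size and abs(dy) <= self.half_size:
            return None
        # The fingertip crossed one of the zone's boundary axes: report the
        # dominant direction, then re-center the zone on the current position
        # so the next command can be entered without returning to a fixed origin.
        if abs(dx) > abs(dy):
            gesture = "right" if dx > 0 else "left"
        else:
            gesture = "down" if dy > 0 else "up"
        self.center = (x, y)
        return gesture


# Usage: feed tracked fingertip coordinates frame by frame.
detector = ZoneAxisDetector()
for x, y in [(100, 100), (120, 102), (160, 105), (165, 150)]:
    g = detector.update(x, y)
    if g:
        print("gesture:", g)
```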