Hand as natural man-machine interface in smart environments

  • Authors:
  • Wei Xie;Eam Khwang Teoh;Ronda Venkateswarlu;Xiang Chen

  • Affiliations:
  • Institute for Infocomm Research, Terrace, Singapore;Nanyang Technological University, Singapore;Institute for Infocomm Research, Terrace, Singapore;Institute for Infocomm Research, Terrace, Singapore

  • Venue:
  • SPPRA'06 Proceedings of the 24th IASTED international conference on Signal processing, pattern recognition, and applications
  • Year:
  • 2006

Abstract

In the near future, computers, vision/sense/speech sensors, intelligent adaptive wireless networks, and similar devices will disappear into the environment, creating smart spaces with embedded information around us. Humans, however, remain the prime users of that embedded information, so interaction with embedded machines calls for human-like natural interfaces. In this paper, we propose a prototype system for a meeting room that uses active stereo vision to create an interactive digital space (an embedded smart space) in which the hand serves as (i) a laser pointer, (ii) a virtual pen for writing on the screen remotely, (iii) a virtual mouse for dragging, dropping, or highlighting a particular part of the screen, and (iv) an auxiliary eraser for removing what has been written on the screen. Besides robust tracking of the hand in 3D space, we need a reliable algorithm to robustly recognize the different hand modes/gestures that trigger these functions. The focus of this paper is developing a robust algorithm, based on a fuzzy neural network (FNN), that distinguishes three different modes of hand operation. Our experiments have successfully demonstrated these three modes in real time using an active stereo vision system.
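To illustrate the kind of fuzzy mode discrimination the abstract describes, the sketch below classifies a hand pose into pointer, pen, or mouse mode using fuzzy rules over two hand-shape features. This is a minimal hypothetical example, not the paper's FNN: the features (extended-finger count, bounding-box aspect ratio) and all membership-function parameters are invented for illustration.

```python
# Hypothetical fuzzy mode classifier (illustrative only; not the paper's FNN).
# Features assumed: f = number of extended fingers, r = hand bounding-box
# aspect ratio. All membership parameters below are made-up placeholders.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# One fuzzy rule per mode: the mode's activation is the minimum (fuzzy AND)
# of its two feature memberships.
RULES = {
    "pointer": lambda f, r: min(tri(f, 0, 1, 2), tri(r, 1.0, 2.0, 3.0)),
    "pen":     lambda f, r: min(tri(f, 1, 2, 3), tri(r, 0.8, 1.5, 2.2)),
    "mouse":   lambda f, r: min(tri(f, 3, 5, 6), tri(r, 0.5, 1.0, 1.5)),
}

def classify(fingers, aspect_ratio):
    """Return (winning mode, per-mode activation scores) for one hand pose."""
    scores = {mode: rule(fingers, aspect_ratio) for mode, rule in RULES.items()}
    return max(scores, key=scores.get), scores
```

In a full FNN, the membership parameters and rule weights would be learned from training data rather than fixed by hand; here they are hard-coded only to make the mode-selection logic concrete.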