EyeScreen: a gesture interface for manipulating on-screen objects

  • Authors:
  • Shanqing Li; Jingjun Lv; Yihua Xu; Yunde Jia

  • Affiliations:
  • School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China (all authors)

  • Venue:
  • HCI'07: Proceedings of the 12th International Conference on Human-Computer Interaction: Intelligent Multimodal Interaction Environments
  • Year:
  • 2007


Abstract

This paper presents a gesture-based interaction system that provides a natural way of manipulating on-screen objects. To recognize hand gestures, we generate a synthetic image by linking the images from two cameras. The synthetic image contains all the features captured from the two views, which alleviates the self-occlusion problem and improves the recognition rate. The MDA and EM algorithms are used to obtain parameters for pattern classification. To compute more detailed pose parameters, such as fingertip positions and hand contours in the image, a random sampling method is introduced into our system. We also describe a background subtraction method based on projective geometry that improves system performance. The robustness of the system has been verified by extensive experiments with different user scenarios, and two applications, a picture browser and a visual pilot, are discussed.
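The abstract's core idea of "linking" two camera views into one synthetic image can be sketched minimally as below. The paper does not specify its exact linking scheme, so this hypothetical sketch simply concatenates two same-height frames side by side, so that features visible in either view land in a single image for recognition:

```python
def make_synthetic_image(view_a, view_b):
    """Link two same-height camera views into one synthetic image by
    row-wise (horizontal) concatenation. Hypothetical sketch only; the
    paper does not detail how the two views are actually combined."""
    if len(view_a) != len(view_b):
        raise ValueError("views must share the same height")
    # Join each row of view_a with the corresponding row of view_b.
    return [row_a + row_b for row_a, row_b in zip(view_a, view_b)]

# Two simulated 2x3 grayscale frames (rows of pixel intensities).
left = [[0, 0, 0], [0, 0, 0]]
right = [[255, 255, 255], [255, 255, 255]]
synthetic = make_synthetic_image(left, right)
print(len(synthetic), len(synthetic[0]))  # 2 6
```

Because the combined frame keeps every pixel from both viewpoints, a finger occluded in one view can still be detected from the other, which is the self-occlusion benefit the abstract claims.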