Tracking pointing gesture in 3D space for wearable visual interfaces

  • Authors:
  • Yunde Jia, Shanqing Li, Yang Liu

  • Affiliations:
  • Beijing Institute of Technology, Beijing, China (all authors)

  • Venue:
  • Proceedings of the International Workshop on Human-Centered Multimedia
  • Year:
  • 2007

Abstract

This paper proposes a practical method for tracking pointing gestures in 3D space for wearable visual interfaces. We integrate dense depth maps and contour cues to achieve more stable tracking performance. A strategy that fuses information from selective attention maps and synthetic feature maps is presented for locating the focus of attention indicated by the pointing gesture. We have developed a wearable stereo vision system for pointing gesture tracking, called POGEST, with FPGA-based dense depth mapping at video rate. The system enables a wearer to locate and select an object in 3D space with his or her pointing hand, establishing a common focus of attention shared by the wearer and the computer for natural human-computer interaction. We also discuss an application to locating objects in 3D virtual environments.
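To make the idea concrete, the sketch below illustrates one way a pointing gesture tracked in a dense depth map can select an object: back-project the hand and fingertip to 3D, cast a ray through them, and return the first scene point the ray meets. This is a minimal Python/NumPy illustration, not the POGEST implementation; the intrinsics, function names, and parameters are all assumptions for the sake of the example.

```python
import numpy as np

# Illustrative pinhole intrinsics; the POGEST rig would use its own
# stereo calibration (these values are assumptions, not from the paper).
FX, FY, CX, CY = 525.0, 525.0, 320.0, 240.0

def backproject(u, v, z):
    """Back-project pixel (u, v) at depth z (meters) into camera coordinates."""
    return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])

def pointed_target(depth_map, hand_px, tip_px, step=0.01, tol=0.03,
                   max_range=10.0):
    """Cast a ray from the hand center through the fingertip and return the
    first 3D scene point it meets in the dense depth map, standing in for
    the focus of attention selected by the pointing gesture."""
    p_hand = backproject(hand_px[0], hand_px[1],
                         depth_map[hand_px[1], hand_px[0]])
    p_tip = backproject(tip_px[0], tip_px[1],
                        depth_map[tip_px[1], tip_px[0]])
    ray = p_tip - p_hand
    ray /= np.linalg.norm(ray)

    # March along the ray beyond the fingertip; at each sample, reproject
    # into the image and compare against the measured depth.
    t = np.linalg.norm(p_tip - p_hand) + step
    while t < max_range:
        p = p_hand + t * ray
        if p[2] <= 0:
            return None  # ray left the camera's field of view
        u = int(round(p[0] * FX / p[2] + CX))
        v = int(round(p[1] * FY / p[2] + CY))
        if not (0 <= u < depth_map.shape[1] and 0 <= v < depth_map.shape[0]):
            return None  # ray exited the image
        z = depth_map[v, u]
        if z > 0 and abs(z - p[2]) < tol:
            return p  # ray hit a measured surface: the pointed-at object
        t += step
    return None
```

In this sketch, hand_px and tip_px would come from a tracker that combines depth and contour cues, and depth_map would be the dense depth image (in meters) produced at video rate by the FPGA stereo hardware; the attention-map fusion described in the abstract would refine which surface point counts as the shared focus of attention.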