Dynamic hand shape manifold embedding and tracking from depth maps

  • Authors:
  • Chan-Su Lee;Sung Yong Chun;Shin Won Park

  • Affiliations:
Department of Electronic Engineering, Yeungnam University, Gyeongsan-si, Gyeongsangbuk-do, Korea (ROK)

  • Venue:
ACCV'12 Proceedings of the 11th Asian Conference on Computer Vision - Volume 2
  • Year:
  • 2012

Abstract

Hand shapes vary with view and hand rotation. In addition, the high degree of freedom of hand configurations makes it difficult to track hand shape variations. This paper presents a new manifold embedding method that models hand shape variations across different hand configurations and across different views due to hand rotation. Instead of traditional silhouette images, the hand shapes are modeled using depth map images, which provide rich shape information invariant to illumination changes. Like silhouettes, these depth map images vary with viewing direction. Sample data along view circles are collected for all the hand configuration variations. A new manifold embedding method is proposed that uses a 4D torus, modeling the product of three circular manifolds, to represent low-dimensional hand configuration and hand rotation. After learning a nonlinear mapping from the proposed embedding space to depth map images, we can track arbitrary shape variations with hand rotation using a particle filter on the embedding manifold. The experimental results on both synthetic and real data show accurate estimation of hand rotation, through the estimation of the view parameters, and of hand configuration, from key hand poses and hand configuration phases.
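The tracking scheme the abstract describes can be illustrated with a minimal sketch: a state of three circular coordinates (a point on the product of circles), a stand-in for the learned embedding-to-depth-map mapping, and a particle filter that diffuses, reweights, and resamples on that manifold. This is not the authors' implementation; the toy `render_depth` mapping, the noise scales, and the particle count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def wrap(theta):
    """Keep angular coordinates on the circle [0, 2*pi)."""
    return np.mod(theta, 2 * np.pi)


def render_depth(state, size=16):
    """Toy stand-in for the learned nonlinear mapping from the
    embedding coordinates to a depth map (illustrative only)."""
    grid = np.linspace(0, 2 * np.pi, size)
    u, v = np.meshgrid(grid, grid)
    a, b, c = state
    return np.sin(u + a) * np.cos(v + b) + 0.5 * np.sin(u + v + c)


def particle_filter_step(particles, weights, observation, noise=0.15):
    """One predict/update cycle on the torus: diffuse the angles,
    reweight by depth-map likelihood, then resample."""
    particles = wrap(particles + rng.normal(0.0, noise, particles.shape))
    for i, p in enumerate(particles):
        err = np.sum((render_depth(p) - observation) ** 2)
        weights[i] = np.exp(-err / 32.0)  # tempered Gaussian likelihood
    weights = weights / weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))


# Track a fixed ground-truth state from noisy depth observations.
true_state = np.array([1.0, 2.0, 0.5])
particles = rng.uniform(0.0, 2 * np.pi, (200, 3))
weights = np.full(200, 1.0 / 200)
for _ in range(30):
    obs = render_depth(true_state) + rng.normal(0.0, 0.05, (16, 16))
    particles, weights = particle_filter_step(particles, weights, obs)

# Circular mean of the particle cloud as the state estimate.
estimate = wrap(np.arctan2(np.sin(particles).mean(axis=0),
                           np.cos(particles).mean(axis=0)))
```

The key detail is that all state arithmetic wraps modulo 2π, and the point estimate uses a circular mean, so the filter respects the torus topology rather than treating the embedding as flat Euclidean space.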