Tracking Hand Rotation and Grasping from an IR Camera Using Cylindrical Manifold Embedding
ICPR '10 Proceedings of the 2010 20th International Conference on Pattern Recognition
This paper presents a new approach for tracking hand rotation and various grasping gestures through an infrared camera. Because of the complexity and ambiguity of an observed hand shape, it is difficult to estimate hand configuration and orientation simultaneously from a silhouette image of a grasping gesture. This paper proposes a dynamic shape model for hand grasping gestures that uses cylindrical manifold embedding to analyze variations of hand shape across configurations between two key hand poses and across simultaneous circular view changes caused by hand rotation. After learning nonlinear generative models from the embedding space to the corresponding observed hand shapes, an arbitrary hand shape between the two key poses, seen from any view, can be generated from a point on the cylindrical manifold embedding. The model is extended to various grasping gestures by decomposing multiple cylindrical manifold embeddings through grasping-style analysis. Grasping gestures with simultaneous hand rotation are then tracked using particle filters on the manifold space together with grasping-style estimation. Experimental results on synthetic and real data indicate that the proposed model accurately tracks various grasping gestures with hand rotation. Because it uses images beyond the visible spectrum, the approach may also be applied to advanced user interfaces in dark environments.
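To make the idea of particle filtering on a cylindrical embedding concrete, the following is a minimal NumPy sketch, not the authors' implementation. It assumes a two-dimensional latent state per particle: a circular view angle `phi` that wraps modulo 2π (modeling hand rotation) and a bounded configuration parameter `t` in [0, 1] that interpolates between two key hand poses. The likelihood here is a toy Gaussian around a hypothetical true state; in the actual system the weight would come from comparing the hand shape generated by the learned nonlinear mapping against the observed silhouette. All names and noise scales are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(particles, sigma_view=0.1, sigma_cfg=0.05):
    """Diffuse particles on the cylindrical embedding.

    Each particle is (phi, t): phi is the circular view angle
    (wraps modulo 2*pi) and t in [0, 1] interpolates between
    the two key hand poses.  Noise scales are illustrative.
    """
    noise = rng.normal(0.0, [sigma_view, sigma_cfg], particles.shape)
    out = particles + noise
    out[:, 0] = np.mod(out[:, 0], 2 * np.pi)   # circular coordinate wraps
    out[:, 1] = np.clip(out[:, 1], 0.0, 1.0)   # configuration stays bounded
    return out

def resample(particles, weights):
    """Multinomial resampling by normalized weights."""
    w = weights / weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy run: weight particles by closeness to a hypothetical true state.
particles = np.column_stack([
    rng.uniform(0, 2 * np.pi, 200),   # view angle phi
    rng.uniform(0, 1, 200),           # grasp configuration t
])
true_state = np.array([1.0, 0.5])
for _ in range(10):
    particles = propagate(particles)
    # Angular difference must respect the wrap-around of phi.
    d_phi = np.angle(np.exp(1j * (particles[:, 0] - true_state[0])))
    d_cfg = particles[:, 1] - true_state[1]
    weights = np.exp(-(d_phi**2 + d_cfg**2) / (2 * 0.1**2))
    particles = resample(particles, weights)

estimate = np.array([
    np.angle(np.mean(np.exp(1j * particles[:, 0]))) % (2 * np.pi),  # circular mean
    particles[:, 1].mean(),
])
```

The wrap-around handling is the point of the sketch: both the process noise and the likelihood treat `phi` as a circular coordinate (differences taken via complex phase), which is what distinguishes a cylindrical latent space from a flat Euclidean one.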