We present an automated system for real-time marker-free motion capture from two calibrated webcams. For fast 3D shape and skin reconstruction, we extend Shape-From-Silhouette algorithms. The motion capture system relies on simple, fast heuristics to increase efficiency, and adopts a multi-modal scheme combining shape and skin-part analysis, temporal coherence, and human anthropometric constraints to increase robustness. Thanks to fast algorithms, low-cost cameras, and the fact that the system runs on a single computer, it is well suited for home entertainment devices. Results on real video sequences demonstrate the efficiency of our approach.
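The core of Shape-From-Silhouette reconstruction can be illustrated with a voxel-carving sketch: every voxel whose projection falls inside all camera silhouettes is kept, and the rest are carved away. The code below is a minimal, generic illustration, not the paper's extended algorithm; the toy orthographic projection matrices, the square silhouettes, and the `carve_visual_hull` helper are assumptions introduced for this example.

```python
import numpy as np

def carve_visual_hull(silhouettes, proj_mats, grid, threshold=0):
    """Keep voxels whose projection lies inside every silhouette.

    silhouettes: list of 2D arrays (>threshold marks foreground)
    proj_mats:   list of 3x4 projection matrices, one per camera
    grid:        (N, 3) array of voxel centers in world coordinates
    """
    occupied = np.ones(len(grid), dtype=bool)
    homo = np.hstack([grid, np.ones((len(grid), 1))])   # homogeneous coords
    for sil, P in zip(silhouettes, proj_mats):
        uvw = homo @ P.T                                # project all voxels at once
        u = (uvw[:, 0] / uvw[:, 2]).round().astype(int) # image column
        v = (uvw[:, 1] / uvw[:, 2]).round().astype(int) # image row
        h, w = sil.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(len(grid), dtype=bool)
        hit[inside] = sil[v[inside], u[inside]] > threshold
        occupied &= hit                                 # carve voxels missed by this view
    return grid[occupied]

# Toy setup: two orthographic views of a cube-shaped "subject".
ax = np.arange(-4, 5)
grid = np.stack(np.meshgrid(ax, ax, ax, indexing="ij"), -1).reshape(-1, 3).astype(float)

P_front = np.array([[1., 0., 0., 5.],   # u = x + 5, v = y + 5
                    [0., 1., 0., 5.],
                    [0., 0., 0., 1.]])
P_top   = np.array([[1., 0., 0., 5.],   # u = x + 5, v = z + 5
                    [0., 0., 1., 5.],
                    [0., 0., 0., 1.]])

sil_front = np.zeros((10, 10)); sil_front[3:7, 3:7] = 1.0
sil_top   = np.zeros((10, 10)); sil_top[3:7, 3:7] = 1.0

hull = carve_visual_hull([sil_front, sil_top], [P_front, P_top], grid)
```

With these two square silhouettes the surviving voxels form a 4x4x4 block, the intersection of the two silhouette cones; adding more views only removes voxels, which is why the visual hull is a conservative (over-approximating) shape estimate.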