Gesture VR: vision-based 3D hand interface for spatial interaction
MULTIMEDIA '98 Proceedings of the sixth ACM international conference on Multimedia
A multi-touch three dimensional touch-sensitive tablet
CHI '85 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
SmartSkin: an infrastructure for freehand manipulation on interactive surfaces
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Electric Field Sensing For Graphical Interfaces
IEEE Computer Graphics and Applications
Partitioned Sampling, Articulated Objects, and Interface-Quality Hand Tracking
ECCV '00 Proceedings of the 6th European Conference on Computer Vision-Part II
Finger Track - A Robust and Real-Time Gesture Interface
AI '97 Proceedings of the 10th Australian Joint Conference on Artificial Intelligence: Advanced Topics in Artificial Intelligence
Robust classification of hand postures against complex backgrounds
FG '96 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition (FG '96)
Fast Tracking of Hands and Fingertips in Infrared Images for Augmented Desk Interface
FG '00 Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition 2000
DigitEyes: Vision-Based Human Hand Tracking
Bare-hand human-computer interaction
Proceedings of the 2001 workshop on Perceptive user interfaces
The 3D Sensor Table system senses bare-hand movements and recognizes simple hand postures, such as a stretched hand, a fist, and a knife-shaped hand. The system is designed for user interaction with real-time two- and three-dimensional graphics applications. It uses electric field sensing to track bare-hand movements up to 30 cm away from the display surface. This paper presents an overview of the system design and implementation, and the algorithm for 3D hand position and posture recognition.
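To illustrate the general idea behind electric field sensing for position tracking, here is a minimal toy sketch. It assumes four electrodes at the corners of a 30 cm square surface and an inverse-square signal falloff, and estimates the hand's 2D position as a signal-weighted centroid; both the geometry and the model are illustrative assumptions, not the paper's actual sensor design or algorithm.

```python
# Hypothetical electrode positions (cm) at the corners of a 30 cm square
# table; illustrative only, not the 3D Sensor Table's actual layout.
ELECTRODES = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0), (30.0, 30.0)]

def simulate_readings(hand_xy, height):
    """Toy forward model: signal strength falls off with the squared
    distance between the hand and each electrode."""
    readings = []
    for ex, ey in ELECTRODES:
        d2 = (hand_xy[0] - ex) ** 2 + (hand_xy[1] - ey) ** 2 + height ** 2
        readings.append(1.0 / d2)
    return readings

def estimate_position(readings):
    """Estimate the 2D hand position as a signal-weighted centroid
    of the electrode positions."""
    total = sum(readings)
    x = sum(r * ex for r, (ex, _) in zip(readings, ELECTRODES)) / total
    y = sum(r * ey for r, (_, ey) in zip(readings, ELECTRODES)) / total
    return (x, y)

# A hand held over the table centre should be estimated near (15, 15).
est = estimate_position(simulate_readings((15.0, 15.0), 10.0))
print(round(est[0], 1), round(est[1], 1))
```

In a real system the forward model would be calibrated per electrode, and the height coordinate would be recovered from the overall signal magnitude rather than assumed.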