Key to the design of gesture-based human-machine interfaces is the machine's ability to quickly and efficiently identify and track the hand movements of its user. In a wearable computer system equipped with head-mounted cameras, this task is extremely difficult: camera motion is uncertain because of the user's head movement and because the user may stand still and then walk unpredictably, while the user's hand or pointing finger can change direction abruptly and at variable speeds. This paper presents a tracking methodology based on a robust state-space estimation algorithm, which limits the influence of uncertain environmental conditions on system performance by adapting the tracking model to compensate for the uncertainties inherent in the data. Our system tracks a user's pointing gesture from a single head-mounted camera, allowing the user to encircle an object of interest and thereby coarsely segment it. A snapshot of the object is then passed to a recognition engine for identification and retrieval of any pre-stored information about the object. Compared with a plain Kalman tracker, our robust tracker reduced the estimated position error by 15% and exhibited a faster response time.
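The abstract contrasts the robust tracker with a plain Kalman tracker. The paper's own adaptation scheme is not detailed here, so the following is only a minimal sketch of the general idea: a constant-velocity Kalman filter tracking a 2D fingertip, with an assumed (illustrative) robust step that inflates the process noise whenever the normalized innovation is large, e.g. when the hand or camera moves abruptly. All matrix values, the gate threshold, and the inflation factor are assumptions for illustration, not the paper's parameters.

```python
import numpy as np

def make_cv_kalman(dt=1.0, q=1.0, r=4.0):
    """Constant-velocity model for 2D point tracking.

    State x = [px, py, vx, vy]; measurement z = [px, py].
    dt, q, r are illustrative values, not taken from the paper.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)   # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # observe position only
    Q = q * np.eye(4)                           # process noise
    R = r * np.eye(2)                           # measurement noise
    return F, H, Q, R

def track(measurements, dt=1.0, q=1.0, r=4.0, gate=3.0):
    """Track a 2D point; when the normalized innovation exceeds the
    gate (abrupt motion), temporarily inflate the process noise.
    This is a crude stand-in for a robust/adaptive estimator."""
    F, H, Q, R = make_cv_kalman(dt, q, r)
    x = np.zeros(4)
    x[:2] = measurements[0]          # initialize at first observation
    P = 10.0 * np.eye(4)             # broad initial uncertainty
    estimates = []
    for z in measurements:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Innovation and its covariance
        y = np.asarray(z, dtype=float) - H @ x
        S = H @ P @ H.T + R
        # Assumed robust step: large normalized innovation -> inflate P
        d2 = y @ np.linalg.solve(S, y)
        if d2 > gate ** 2:
            P = P + 10.0 * Q
            S = H @ P @ H.T + R
        # Update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        estimates.append(x[:2].copy())
    return np.array(estimates)
```

On a smooth trajectory the gate rarely fires and the filter behaves like a plain Kalman tracker; on an abrupt direction change the inflated covariance raises the gain so the estimate re-acquires the finger faster, which is the qualitative behavior (faster response) the abstract reports.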