In this paper we describe a real-time system for detecting pointing gestures and estimating the direction of pointing using stereo cameras. Previous systems of this kind used color-based blob trackers that depend on reliable skin-color detection, an approach that is sensitive to lighting changes and to the clothing worn by the user. In contrast, our stereo system produces dense disparity maps in real time, and disparity maps are considerably less sensitive to lighting changes. The system subtracts the background, partitions the foreground pixels into body parts using a robust mixture model, and estimates the direction of pointing. We tested the system on both coarse and fine pointing: selecting targets in a room and controlling a cursor on a wall screen, respectively.
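The pipeline in the abstract (background subtraction on the disparity map, mixture-model segmentation of the foreground into body parts, and a pointing vector between parts) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the two-component spherical Gaussian mixture, the deterministic initialization, and the head-to-hand centroid heuristic are all assumptions made for the example.

```python
import numpy as np

def segment_and_point(disparity, background, thresh=5.0, iters=20):
    """Hypothetical sketch of the described pipeline:
    1) background subtraction on the disparity map,
    2) EM fit of a 2-component spherical Gaussian mixture over the
       foreground pixel coordinates (standing in for the body-part model),
    3) pointing direction as the normalized vector between the two
       part centroids (e.g. head -> hand)."""
    fg = np.abs(disparity - background) > thresh          # foreground mask
    pts = np.argwhere(fg).astype(float)                   # (row, col) coords
    # Deterministic init: one mean near each extreme of the foreground.
    mu = np.stack([pts.min(0), pts.max(0)])
    var = np.full(2, pts.var())                           # per-component sigma^2
    w = np.full(2, 0.5)                                   # mixing weights
    for _ in range(iters):                                # EM iterations
        d2 = ((pts[:, None, :] - mu[None]) ** 2).sum(-1)  # squared distances
        logp = np.log(w) - d2 / (2 * var) - np.log(var)   # 2-D spherical Gaussian
        r = np.exp(logp - logp.max(1, keepdims=True))
        r /= r.sum(1, keepdims=True)                      # responsibilities
        nk = r.sum(0)
        mu = (r.T @ pts) / nk[:, None]                    # update means
        var = (r * d2).sum(0) / (2 * nk) + 1e-6           # update variances
        w = nk / len(pts)                                 # update weights
    direction = mu[1] - mu[0]
    return mu, direction / np.linalg.norm(direction)

# Usage on synthetic data: two foreground "body part" blobs on a flat background.
bg = np.zeros((100, 100))
disp = bg.copy()
disp[20:31, 20:31] = 20.0                                 # blob centered at (25, 25)
disp[70:81, 70:81] = 20.0                                 # blob centered at (75, 75)
means, d = segment_and_point(disp, bg)
```

A real system would fit one mixture component per tracked body part and work in 3-D coordinates recovered from the disparity, but the same EM update structure applies.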