CONDENSATION—Conditional Density Propagation for Visual Tracking
International Journal of Computer Vision
Using marking menus to develop command sets for computer vision based hand gesture interfaces
Proceedings of the second Nordic conference on Human-computer interaction
The Challenges of Wearable Computing: Part 1
IEEE Micro
WUW - wear Ur world: a wearable gestural interface
CHI '09 Extended Abstracts on Human Factors in Computing Systems
Generic Framework for Transforming Everyday Objects into Interactive Surfaces
Proceedings of the 13th International Conference on Human-Computer Interaction. Part III: Ubiquitous and Intelligent Interaction
Mobile Interfaces Using Body Worn Projector and Camera
VMR '09 Proceedings of the 3rd International Conference on Virtual and Mixed Reality: Held as Part of HCI International 2009
GART: the gesture and activity recognition toolkit
HCI'07 Proceedings of the 12th international conference on Human-computer interaction: intelligent multimodal interaction environments
Vision-Based interpretation of hand gestures for remote control of a computer mouse
ECCV'06 Proceedings of the 2006 international conference on Computer Vision in Human-Computer Interaction
Gesture-based user interfaces for public spaces
UAHCI'11 Proceedings of the 6th international conference on Universal access in human-computer interaction: users diversity - Volume Part II
The Portable Gestural Interface PyGmI, which we implemented, is a tool for interacting with a system via simple hand gestures. The user wears color markers on the fingers and a webcam on the chest. The implemented prototype lets the user view and navigate presentation files by means of a tiny projector fixed to the user's belt. Gesture recognition relies on color segmentation, tracking, and the Gesture and Activity Recognition Toolkit (GART). This article presents PyGmI: its setup, the designed gestures, the recognition modules, an application built on it, and finally an evaluation.
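The color-segmentation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes fixed RGB threshold bounds for a marker color and locates the marker by the centroid of the segmented pixels; all names and the synthetic test frame are our own.

```python
import numpy as np

def segment_marker(frame, lower, upper):
    """Return a boolean mask of pixels whose RGB values fall within the bounds."""
    return np.all((frame >= lower) & (frame <= upper), axis=-1)

def marker_centroid(mask):
    """Centroid (row, col) of a boolean mask, or None if no pixels matched."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return (float(ys.mean()), float(xs.mean()))

# Synthetic 8x8 RGB frame with a green "marker" patch at rows 2-3, cols 5-6.
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[2:4, 5:7] = (0, 255, 0)

mask = segment_marker(frame, lower=(0, 200, 0), upper=(50, 255, 50))
print(marker_centroid(mask))  # (2.5, 5.5)
```

Tracking the centroid from frame to frame then yields the marker trajectories that a toolkit such as GART can classify into gestures.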