SIM - Smart Interactive Map with Pointing Gestures
IHMSC '12 Proceedings of the 2012 4th International Conference on Intelligent Human-Machine Systems and Cybernetics - Volume 02
Pointing is a common human gesture. People incorporate pointing into their daily activities not only with the bare hand but also with gloves or with pointers such as pens, rulers, long sticks, and batons. In this article, the authors therefore propose a new interaction concept centered on this natural gesture, together with a method to detect it under various circumstances. Unlike common approaches that rely on predefined skin colors or markers, the proposed method can segment and detect the tip of any pointer and can process multiple objects at a time. The method achieves an average accuracy of 91.0%; when multiple users hold different pointing objects, accuracy drops slightly to 87.9%. The running time is at most 17.14 ms for 9 objects processed in parallel, so the method satisfies real-time constraints.
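The abstract does not detail how a pointer tip is located once an object is segmented. As a rough illustration of the geometry involved, the sketch below shows one common heuristic (not necessarily the authors' method): treat the tip of an elongated blob as the foreground pixel farthest from the blob's centroid. The mask, function name, and test shape here are all illustrative assumptions.

```python
import numpy as np

def find_pointer_tip(mask):
    """Return (row, col) of the foreground pixel farthest from the
    blob centroid -- a simple tip heuristic for elongated pointers.
    `mask` is a 2-D binary array; returns None if it is empty."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    cy, cx = ys.mean(), xs.mean()
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2   # squared distances to centroid
    i = int(np.argmax(d2))
    return (int(ys[i]), int(xs[i]))

# Synthetic "pointer": a thick bar with a thin protrusion on the right,
# mimicking a hand holding a stick; the protrusion's end is the tip.
mask = np.zeros((20, 40), dtype=np.uint8)
mask[9:12, 5:30] = 1     # thick part (hand / pointer body)
mask[10, 30:38] = 1      # thin tip extending to column 37
tip = find_pointer_tip(mask)   # -> (10, 37)
```

Because each segmented object is handled independently, a heuristic of this kind extends naturally to the multi-object, parallel setting the abstract describes.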