Over the past few years, Augmented Reality (AR) has become widely popular in the form of smartphone applications. However, most smartphone-based AR applications offer only limited user interaction and do not support gesture-based direct manipulation of the augmented scene. In this paper, we introduce a new AR interaction methodology that employs users' hands and fingers to interact with the virtual (and possibly physical) objects that appear on the mobile phone screen. The goal of this project was to support several types of interaction (selection, transformation, and fine-grained control of an input value) while keeping the hand-detection methodology as simple as possible to maintain good performance on smartphones. We evaluated our methods in user studies, collecting task-performance data and user impressions of this direct way of interacting with augmented scenes through mobile phones.
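To give a sense of how lightweight hand detection can be on a mobile device, the sketch below shows a simple per-pixel skin classifier based on fixed RGB thresholds (a Peer-style heuristic). The specific thresholds and function names are illustrative assumptions, not the authors' actual method:

```python
def is_skin_rgb(r, g, b):
    """Heuristic per-pixel skin test on 8-bit RGB values.

    Illustrative thresholds only (Peer-style rule); real systems
    typically tune these or use a learned color model.
    """
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15  # enough color spread
            and abs(r - g) > 15                   # skin is redder than green
            and r > g and r > b)                  # red-dominant pixel

def skin_mask(image):
    """image: list of rows of (r, g, b) tuples -> binary mask (1 = skin)."""
    return [[1 if is_skin_rgb(*px) else 0 for px in row] for row in image]
```

A classifier like this needs no training data and runs in a single pass over the frame, which is why such rules are a common starting point when per-frame performance on a phone is the constraint.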