This paper describes SmartCanvas, an intelligent desk system that lets a user draw freehand on a desk or similar surface using hand gestures. The system requires a single camera and no touch sensors. Its key underlying technique is a vision-based method that distinguishes drawing gestures from transitional gestures in real time, avoiding the need for "artificial" gestures to mark the beginning and end of each drawing stroke. The method achieves an average classification accuracy of 92.17%. Pie-shaped menus and a "rotate-to-and-select" approach eliminate the need for a fixed menu display, resulting in an "invisible" interface.
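The "rotate-to-and-select" idea can be illustrated with a minimal sketch: map the tracked fingertip's angle around an (invisible) menu center to one of several pie slices. The slice count, coordinate convention, and function names below are assumptions for illustration, not details taken from the paper.

```python
import math

def pie_slice(fingertip, center, n_slices=8):
    """Return the index of the pie-menu slice the fingertip points at.

    Hypothetical sketch: angles are measured counter-clockwise from the
    +x axis and the circle is split into n_slices equal wedges, with
    slice 0 starting at angle 0. Dwell/confirmation logic is omitted.
    """
    dx = fingertip[0] - center[0]
    dy = fingertip[1] - center[1]
    # atan2 returns -pi..pi; normalize to 0..2*pi before bucketing
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(angle / (2 * math.pi / n_slices))

# Example: a fingertip directly "right" of the center falls in slice 0,
# and one directly "above" it (in +y) falls in slice 2 of 8.
print(pie_slice((10, 0), (0, 0)))  # -> 0
print(pie_slice((0, 10), (0, 0)))  # -> 2
```

Because selection depends only on the fingertip's angle relative to an invisible center, no menu needs to be rendered on the surface, which matches the "invisible interface" framing in the abstract.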