This paper describes a two-handed drawing tool developed on our augmented desk system. Using our real-time finger-tracking method, a user can draw and manipulate objects interactively with his or her own hands. Following earlier work on two-handed interaction, a different role is assigned to each hand. The right hand is used to draw and to manipulate objects; through gesture recognition, primitive objects can be drawn from the user's handwriting. The left hand, in turn, is used to manipulate menus and to assist the right hand. By closing all the fingers of the left hand, users can bring up structured radial menus around that hand and select the appropriate item with a left-hand finger. The left hand also assists in drawing tasks, e.g., specifying the center of a circle, the top-left corner of a rectangle, or the object to be copied.
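The asymmetric role assignment described above can be sketched as a small event-dispatch loop. This is a minimal illustration under assumed names (`HandState`, `DeskUI` are hypothetical, not the authors' code): a fully closed left hand opens the radial menu, while an extended right-hand finger appends points to the current stroke.

```python
from dataclasses import dataclass, field

@dataclass
class HandState:
    """One tracked hand: fingertip position and count of extended fingers."""
    tip: tuple            # (x, y) fingertip position from the tracker
    extended_fingers: int

@dataclass
class DeskUI:
    menu_open: bool = False
    stroke: list = field(default_factory=list)

    def update(self, left: HandState, right: HandState) -> None:
        # Left hand: closing all fingers opens the radial menu around it.
        self.menu_open = (left.extended_fingers == 0)
        # Right hand: while a finger is extended (and no menu is up,
        # an assumption here), extend the current drawing stroke.
        if right.extended_fingers > 0 and not self.menu_open:
            self.stroke.append(right.tip)

ui = DeskUI()
ui.update(HandState((0, 0), 0), HandState((10, 20), 1))  # left fist: menu opens
ui.update(HandState((0, 0), 5), HandState((11, 21), 1))  # open left hand: drawing resumes
```

Suppressing drawing while the menu is open is a design assumption of this sketch; the paper's system may instead let the left hand's menu selection run concurrently with right-hand drawing.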