Natural and intuitive human-computer interfaces, while striving for an immersive experience, often suffer from relatively cumbersome input methods. Mice and keyboards are widely accepted but limited to point-and-click and basic text-editing operations, distancing the user from the world presented to them. Touch-screen gestures offer a partial solution; however, they are constrained by screen size, device weight, and positioning, and currently support only a limited set of gesture-based operations. In this paper, we propose a gesture model based on a meta-action formulation, derived from observations of existing touch gestures along several dimensions: type, multiplicity, stage-of-control, and control space. We then propose a kernel gesture set that forms the basis of a unified user experience across a wide variety of applications and devices. To demonstrate the effectiveness, intuitiveness, and adaptability of the proposed kernel gesture set, we present a user study. We further implement and extend the kernel gesture set through a case study in a practical domain: garment modeling.
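As a rough illustration of the four descriptive dimensions named above (type, multiplicity, stage-of-control, and control space), a gesture event could be encoded as a small record combining one value per dimension. The concrete enum members and field names below are illustrative assumptions, not the paper's actual taxonomy:

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical value sets for each dimension; the real kernel gesture set
# defined in the paper may partition these differently.
class GestureType(Enum):
    TAP = auto()
    SWIPE = auto()
    PINCH = auto()

class Stage(Enum):
    REGISTRATION = auto()   # the gesture is first recognized
    CONTINUATION = auto()   # the gesture is being tracked
    TERMINATION = auto()    # the gesture ends

class ControlSpace(Enum):
    SCREEN_2D = auto()      # manipulation on the touch surface
    WORLD_3D = auto()       # manipulation of the modeled 3D scene

@dataclass(frozen=True)
class Gesture:
    gesture_type: GestureType
    multiplicity: int       # number of simultaneous contact points
    stage: Stage
    control_space: ControlSpace

# Example: a two-finger pinch being tracked on the touch surface.
pinch = Gesture(GestureType.PINCH, 2, Stage.CONTINUATION, ControlSpace.SCREEN_2D)
```

Factoring gestures along independent dimensions like this is what lets a small kernel set cover many applications: each application interprets the same record in its own control space.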