Generic gesture kernel modeling and its application with virtual garment design

  • Authors:
  • Shuang Liang; Justin Cameron; George Baciu

  • Affiliations:
  • Hong Kong Polytechnic University; Hong Kong Polytechnic University; Hong Kong Polytechnic University

  • Venue:
  • Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry
  • Year:
  • 2011


Abstract

Natural and intuitive human-computer interfaces, while striving for an immersive experience, often suffer from relatively cumbersome input methods. Mice and keyboards are widely accepted, yet they are limited to point-and-click and basic text-editing operations, distancing the user from the world presented to them. Touch-screen gestures offer a partial solution, but they are constrained by screen size, weight, and positioning, and currently support only a limited set of gesture-based operations. In this paper, we propose a gesture model based on a meta-action formulation, derived from observations of existing touch gestures along several dimensions: type, multiplicity, stage-of-control, and control space. We then propose a kernel gesture set that forms the basis of a unified user experience across a wide variety of applications and devices. To demonstrate the effectiveness, intuitiveness, and adaptability of the proposed kernel gesture set, we present a user study. We further implement and extend the kernel gesture set through a case study in a practical domain: garment modeling.
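The abstract characterizes touch gestures along four dimensions (type, multiplicity, stage-of-control, and control space) and dispatches them through a kernel gesture set. The paper itself does not give an implementation; the following is a minimal illustrative sketch of how such a meta-action record and kernel lookup table could be organized. All enumeration values, the `MetaAction` record, and the `KERNEL` mapping are hypothetical names invented for illustration, not the authors' actual formulation.

```python
# Hypothetical sketch of a meta-action gesture record keyed on the four
# dimensions named in the abstract: type, multiplicity, stage-of-control,
# and control space. Names and values are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class GestureType(Enum):      # kind of touch motion
    TAP = auto()
    DRAG = auto()
    PINCH = auto()
    ROTATE = auto()


class Stage(Enum):            # stage-of-control of an ongoing gesture
    START = auto()
    CONTINUE = auto()
    END = auto()


class ControlSpace(Enum):     # what the gesture manipulates
    OBJECT = auto()           # a scene object (e.g. a garment panel)
    VIEW = auto()             # the camera / viewpoint


@dataclass(frozen=True)
class MetaAction:
    """One gesture observation described along the four dimensions."""
    gesture_type: GestureType
    multiplicity: int         # number of simultaneous touch points
    stage: Stage
    space: ControlSpace


# A tiny hypothetical kernel gesture set: a mapping from
# (type, multiplicity, control space) to an application-level operation.
KERNEL = {
    (GestureType.DRAG, 1, ControlSpace.OBJECT): "translate-object",
    (GestureType.ROTATE, 2, ControlSpace.OBJECT): "rotate-object",
    (GestureType.PINCH, 2, ControlSpace.VIEW): "zoom-view",
}


def dispatch(action: MetaAction) -> Optional[str]:
    """Look up the kernel operation for a meta-action, if one is defined."""
    return KERNEL.get((action.gesture_type, action.multiplicity, action.space))
```

Under this sketch, a two-finger pinch acting on the view would resolve to the zoom operation, while an unmapped combination returns `None`, leaving room for application-specific extensions such as the garment-modeling case study.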