Natural and intuitive interfaces for CAD modeling, such as hand-gesture controls, have received much attention recently. However, despite their high intuitiveness and familiarity, hand gestures have proven less comfortable in practice than a conventional mouse interface because of user fatigue over long periods of operation. In this paper, we propose an improved gesture-control interface for 3D model manipulation tasks that offers usability comparable to a conventional interface, with low user fatigue and a high level of intuitiveness. By analyzing the problems of previous hand-gesture controls in translation, rotation, and zooming, we developed a multi-modal control interface, GaFinC (Gaze and Finger Control). GaFinC tracks precise hand positions, recognizes several finger gestures, and uses an independent gaze-pointing interface to set the point of interest. To verify GaFinC's performance, we conducted tests of manipulation accuracy and completion time and compared the results with those of a conventional mouse; comfort and intuitiveness were scored through user interviews. Although GaFinC fell short of the mouse in accuracy and completion time, its performance was at an applicable level, and users found it more intuitive than a mouse interface while it maintained a usable level of comfort.
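The multi-modal scheme described above (gaze sets the point of interest; finger gestures select and drive the manipulation mode) can be sketched as a simple per-frame dispatcher. This is a hypothetical illustration, not the paper's actual implementation: the gesture labels (`pinch`, `grab`, `spread`), the `Frame` fields, and the state layout are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    gaze_point: tuple   # (x, y, z) point of interest from the gaze tracker
    gesture: str        # recognized finger-gesture label (assumed names)
    hand_delta: tuple   # per-frame hand displacement (dx, dy, dz)

def dispatch(frame: Frame, model_state: dict) -> dict:
    """Route one input frame to a manipulation, pivoting about the gaze point."""
    state = dict(model_state)
    if frame.gesture == "pinch":      # translate the model with the hand
        state["position"] = tuple(
            p + d for p, d in zip(state["position"], frame.hand_delta))
    elif frame.gesture == "grab":     # rotate about the gaze-selected pivot
        state["pivot"] = frame.gaze_point
        state["yaw"] += frame.hand_delta[0]     # yaw from horizontal hand motion
    elif frame.gesture == "spread":   # zoom toward/away from the gaze point
        state["zoom"] *= 1.0 + 0.1 * frame.hand_delta[2]
    return state

# Example: a pinch gesture translates the model by the hand displacement.
state = {"position": (0.0, 0.0, 0.0), "yaw": 0.0, "zoom": 1.0, "pivot": None}
state = dispatch(Frame((1.0, 2.0, 0.5), "pinch", (0.1, 0.0, 0.0)), state)
```

The key design point this sketch tries to capture is the separation of concerns: gaze, which is fast but imprecise, only selects *where* the manipulation applies, while hand motion, which is precise but fatiguing, supplies the *magnitude* of the change.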