Commercial 3D scene-acquisition systems such as the Microsoft Kinect sensor can reduce the cost barrier of realizing mid-air interaction. However, because such sensors can robustly track hand position but not hand orientation, current mid-air interaction methods for 3D virtual object manipulation often require context and mode switching to perform translation, rotation, and scaling, preventing natural, continuous gestural interaction. A novel handle bar metaphor is proposed as an effective visual control metaphor between the user's hand gestures and the corresponding virtual object manipulation operations. It mimics the familiar situation of handling objects skewered on a bimanual handle bar. Designing the mid-air interaction around the relative 3D motion of the two hands provides precise controllability despite the Kinect sensor's low image resolution. A comprehensive repertoire of 3D manipulation operations is proposed to manipulate single objects, perform fast constrained rotation, and pack/align multiple objects along a line. Three user studies demonstrate the efficacy and intuitiveness of the proposed interaction techniques in different virtual manipulation scenarios.
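To illustrate the core idea of mapping the relative 3D motion of two tracked hands onto object transforms, the sketch below derives a translation, a rotation, and a scale factor from successive left/right hand positions. This is a minimal illustrative example, not the paper's implementation: the function name `handle_bar_update` and the specific axis-angle formulation are assumptions, and roll about the bar axis is ignored since it is not observable from two point positions alone.

```python
import numpy as np

def handle_bar_update(prev_l, prev_r, cur_l, cur_r):
    """Map two-hand motion onto (translation, rotation, scale).

    Hypothetical sketch of a handle-bar-style mapping: the two hand
    positions define the endpoints of a virtual bar skewering the object.
    """
    prev_l, prev_r = np.asarray(prev_l, float), np.asarray(prev_r, float)
    cur_l, cur_r = np.asarray(cur_l, float), np.asarray(cur_r, float)

    # Translation: displacement of the bar's midpoint.
    translation = (cur_l + cur_r) / 2.0 - (prev_l + prev_r) / 2.0

    # Scale: ratio of hand separations (stretching/squeezing the bar).
    prev_axis = prev_r - prev_l
    cur_axis = cur_r - cur_l
    scale = np.linalg.norm(cur_axis) / np.linalg.norm(prev_axis)

    # Rotation: axis-angle rotation aligning the previous bar direction
    # with the current one (roll about the bar is unobservable here).
    u = prev_axis / np.linalg.norm(prev_axis)
    v = cur_axis / np.linalg.norm(cur_axis)
    axis = np.cross(u, v)
    angle = np.arctan2(np.linalg.norm(axis), np.dot(u, v))
    return translation, angle, axis, scale
```

Because every transform is derived from the *relative* configuration of the two hands, small absolute tracking noise largely cancels out, which is one plausible reading of how such a mapping stays precise despite a low-resolution depth sensor.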