Computer-Aided Design (CAD) typically involves tasks, such as adjusting the camera perspective and assembling parts in free space, that require specifying 6 degrees of freedom (DOF). The standard approach is to factor these DOFs into 2D subspaces that are mapped to the x and y axes of a mouse. This metaphor is inherently modal, because the user must switch between subspaces, and it disconnects the input space from the modeling space. In this paper, we propose a bimanual hand-tracking system that provides physically motivated 6-DOF control for 3D assembly. First, we discuss a set of principles that guide the design of a system that is precise, easy to learn, and comfortable to use. Based on these guidelines, we describe a 3D input metaphor that supports the constraint specification classically used in CAD software, relies on only a few simple gestures, lets users rest their elbows on the desk, and works alongside the keyboard and mouse. Our approach uses two consumer-grade webcams to observe the user's hands. We solve the pose-estimation problem with efficient queries of a precomputed database that relates hand silhouettes to their 3D configurations. We demonstrate efficient 3D mechanical assembly of several CAD models using our hand-tracking system.
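The database-lookup idea can be sketched as nearest-neighbor search over precomputed silhouette descriptors. This is a minimal illustrative sketch, not the authors' implementation: the descriptor here (a flattened coarse occupancy grid of the binary silhouette) and all names (`silhouette_descriptor`, `PoseDatabase`) are assumptions made for the example.

```python
# Hypothetical sketch of pose estimation via a precomputed database that
# relates hand silhouettes to 3D configurations. The descriptor and class
# names are illustrative assumptions, not the paper's actual code.
import numpy as np

def silhouette_descriptor(mask, grid=8):
    """Downsample a binary silhouette image to a grid x grid occupancy vector."""
    h, w = mask.shape
    cells = mask.reshape(grid, h // grid, grid, w // grid)
    return cells.mean(axis=(1, 3)).ravel()

class PoseDatabase:
    """Maps silhouette descriptors to known 3D hand configurations."""
    def __init__(self):
        self.keys = []    # precomputed descriptors
        self.poses = []   # associated hand poses (e.g. joint-angle vectors)

    def add(self, mask, pose):
        self.keys.append(silhouette_descriptor(mask))
        self.poses.append(pose)

    def query(self, mask):
        """Return the stored pose whose silhouette descriptor is closest (L2)."""
        d = silhouette_descriptor(mask)
        dists = np.linalg.norm(np.asarray(self.keys) - d, axis=1)
        return self.poses[int(np.argmin(dists))]
```

At runtime, each camera frame's segmented hand silhouette is reduced to a descriptor and matched against the database; a real system would use a far richer descriptor and an index structure for sub-millisecond queries.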