Speakers often use hand gestures when talking about or describing physical objects. Such gestures are particularly useful when the speaker is conveying distinctions of shape that are difficult to express verbally. We present data miming---an approach to making sense of gestures as they are used to describe concrete physical objects. We first observe participants as they use gestures to describe real-world objects to another person. From these observations, we derive the data miming approach, which is based on a voxel representation of the space traced by the speaker's hands over the duration of the gesture. In a final proof-of-concept study, we demonstrate a prototype implementation that matches the input voxel representation against a database of known physical objects.
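The core idea---discretizing the space swept by the hands into occupied voxels and comparing that grid against stored object grids---can be sketched as follows. This is not the paper's implementation; it is a minimal illustration assuming tracked hand positions arrive as 3D points in a fixed bounding volume, and using simple Jaccard overlap as a stand-in for the matching score (the function names `voxelize`, `jaccard`, and `best_match` are hypothetical).

```python
def voxelize(points, grid_size=32, bounds=(-1.0, 1.0)):
    """Mark the voxels visited by tracked hand positions.

    points: iterable of (x, y, z) tuples in the bounding volume.
    Returns a set of integer voxel indices (a sparse occupancy grid).
    """
    lo, hi = bounds
    scale = grid_size / (hi - lo)
    cells = set()
    for x, y, z in points:
        # Map each coordinate into [0, grid_size) and clamp to the grid.
        cells.add(tuple(
            min(grid_size - 1, max(0, int((c - lo) * scale)))
            for c in (x, y, z)
        ))
    return cells

def jaccard(a, b):
    """Overlap score between two sparse voxel grids (1.0 = identical)."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def best_match(query, database):
    """Return the name of the database grid most similar to the query."""
    return max(database, key=lambda name: jaccard(query, database[name]))
```

For example, a query traced along one axis would match a stored grid built from the same trace rather than a grid occupying a distant corner of the volume. A real system would need temporal accumulation over the whole gesture, hand-volume (not point) occupancy, and pose normalization before matching.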