Recent progress in 3D immersive display and virtual reality (VR) technologies has made possible many exciting applications. Fully exploiting this potential requires "natural" interfaces that allow users to manipulate such displays without cumbersome attachments. In this article we describe the use of visual hand-gesture analysis and speech recognition to develop a speech/gesture interface for controlling a 3D display. The interface enhances an existing application, VMD, a VR visual computing environment for structural biology. Free-hand gestures, together with a set of speech commands, manipulate the 3D graphical display. We found
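To make the combination of the two modalities concrete, the sketch below shows one common way such interfaces pair a spoken command with a concurrent free-hand gesture: events from each recognizer carry timestamps, and a command is issued only when speech and gesture fall within a short temporal window. This is a hypothetical illustration of multimodal fusion in general, not the fusion scheme of the VMD interface itself; the class names, the `fuse` function, and the one-second window are all assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SpeechEvent:
    command: str        # recognized spoken command, e.g. "rotate" or "zoom"
    timestamp: float    # seconds since session start

@dataclass
class GestureEvent:
    displacement: Tuple[float, float]  # e.g. tracked hand motion (dx, dy)
    timestamp: float

def fuse(speech: SpeechEvent,
         gesture: GestureEvent,
         window: float = 1.0) -> Optional[Tuple[str, Tuple[float, float]]]:
    """Pair a spoken command with a free-hand gesture when the two
    events occur within `window` seconds of each other; otherwise
    report no fused command."""
    if abs(speech.timestamp - gesture.timestamp) <= window:
        return (speech.command, gesture.displacement)
    return None

# Saying "rotate" while sweeping the hand to the right yields a
# rotation command parameterized by the hand's displacement.
action = fuse(SpeechEvent("rotate", 10.2), GestureEvent((0.5, 0.0), 10.5))
```

In this sketch the speech channel selects *what* to do while the gesture channel supplies *how much* and *in which direction*, which is the usual division of labor in speech/gesture interfaces of this kind.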