This SIG is a forum to advance an integrated approach to multi-modal Natural User Interfaces (NUIs). Until now, the research and design of NUIs for the various modalities (speech, touch, gesture) have proceeded independently. We propose an integrated discussion among both academics and practitioners to stimulate the exchange of knowledge about the various modalities and how they might be fruitfully combined, and to identify key areas of future research and design that make the case for multi-modal NUIs. The goal is not only to create a vision of synthesized NUI applications by connecting researchers, but also to discuss ways to make that vision a reality.