Readings in intelligent user interfaces
Perceptual user interfaces: multimodal interfaces that process what comes naturally
Communications of the ACM
The representation of multimodal user interface dialogues using discourse pegs
ACL '92 Proceedings of the 30th annual meeting on Association for Computational Linguistics
MATCH: an architecture for multimodal dialogue systems
ACL '02 Proceedings of the 40th Annual Meeting on Association for Computational Linguistics
Automatic creation of interface specifications from ontologies
SEALTS '03 Proceedings of the HLT-NAACL 2003 workshop on Software engineering and architecture of language technology systems - Volume 8
MULTIPLATFORM testbed: an integration platform for multimodal dialog systems
SEALTS '03 Proceedings of the HLT-NAACL 2003 workshop on Software engineering and architecture of language technology systems - Volume 8
Proceedings of the 6th international conference on Multimodal interfaces
A look under the hood: design and development of the first SmartWeb system demonstrator
ICMI '05 Proceedings of the 7th international conference on Multimodal interfaces
HCI Beyond the GUI: Design for Haptic, Speech, Olfactory, and Other Nontraditional Interfaces
A browser-based multimodal interaction system
ICMI '08 Proceedings of the 10th international conference on Multimodal interfaces
Semi-automatic multimodal user interface generation
Proceedings of the 1st ACM SIGCHI symposium on Engineering interactive computing systems
UAHCI '09 Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part II: Intelligent and Ubiquitous Interaction Environments
Intuition as instinctive dialogue
Computing with instinct
Conveying multimodal interaction possibilities through the use of appearances
Create'10 Proceedings of the 2010 international conference on The Interaction Design
A generic formal model for fission of modalities in output multi-modal interactive systems
VECoS'09 Proceedings of the Third international conference on Verification and Evaluation of Computer and Communication Systems
Modeling ontology for multimodal interaction in ubiquitous computing systems
Proceedings of the 2012 ACM Conference on Ubiquitous Computing
SIGDIAL '12 Proceedings of the 13th Annual Meeting of the Special Interest Group on Discourse and Dialogue
A Dynamic Spoken Dialogue Interface for Ambient Intelligence Interaction
International Journal of Ambient Computing and Intelligence
Design guidelines for adaptive multimodal mobile input solutions
Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services
The development of an intelligent user interface that supports multimodal access to multiple applications is a challenging task. In this paper we present a generic multimodal interface system in which the user interacts with an anthropomorphic, personalized interface agent using speech and natural gestures. SmartKom's knowledge-based and uniform approach enables us to realize a comprehensive system that understands imprecise, ambiguous, or incomplete multimodal input and generates coordinated, cohesive, and coherent multimodal presentations for three scenarios, currently covering more than 50 different functionalities across 14 applications. We illustrate the main ideas by walking through the principal processing steps, from modality fusion to modality fission.
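The fusion-to-fission pipeline mentioned in the abstract can be sketched in miniature: ambiguous speech (a demonstrative like "that") is fused with a pointing gesture into a single intention, which is then fissioned across output modalities. This is a minimal illustrative sketch only; all class and function names are assumptions and do not reflect SmartKom's actual architecture or APIs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModalityInput:
    modality: str     # e.g. "speech" or "gesture" (illustrative labels)
    content: str      # recognized utterance, or the id of a gestured-at object
    confidence: float # recognizer confidence in [0, 1]

def fuse(speech: ModalityInput, gesture: Optional[ModalityInput]) -> dict:
    """Modality fusion: resolve an ambiguous spoken reference with a gesture."""
    intention = {"act": speech.content, "referent": None}
    if "that" in speech.content and gesture is not None:
        # Deictic resolution: bind the demonstrative to the gestured object.
        intention["referent"] = gesture.content
    return intention

def fission(intention: dict) -> list:
    """Modality fission: distribute the response over output modalities."""
    outputs = []
    if intention["referent"] is not None:
        outputs.append(("graphics", f"highlight {intention['referent']}"))
        outputs.append(("speech", f"Here is {intention['referent']}."))
    else:
        # Incomplete input: ask a clarification question in speech only.
        outputs.append(("speech", "Which item do you mean?"))
    return outputs

speech = ModalityInput("speech", "show me that", 0.9)
gesture = ModalityInput("gesture", "cinema_listing_3", 0.8)
print(fission(fuse(speech, gesture)))
```

The sketch shows only the coordination idea: fusion produces one amodal intention, and fission chooses which output channels render it, so graphics and speech stay mutually consistent.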