Multimodal interfaces have proven to be ideal candidates for interactive systems that adapt to a user either automatically or based on user-defined rules. However, user-based adaptation demands correspondingly advanced software architectures and algorithms. We present a novel multimodal fusion algorithm for the development of adaptive interactive systems, based on hidden Markov models (HMM). To select relevant modalities at the semantic level, the algorithm incorporates the temporal relationships between input events. The algorithm has been evaluated in three use cases, from which we identify the main challenges involved in developing adaptive multimodal interfaces.
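To make the idea concrete, the following is a minimal sketch of HMM-based fusion over a sequence of multimodal input events: a temporal-window check decides whether events are close enough in time to be fused, and the HMM forward algorithm scores the resulting observation sequence. All states, symbols, probabilities, and function names here are illustrative assumptions, not the paper's actual algorithm or parameters.

```python
# Illustrative toy HMM: hidden interaction states emit observable
# input modalities (speech / gesture). Probabilities are invented.
STATES = ["point", "select"]
START = {"point": 0.6, "select": 0.4}
TRANS = {"point": {"point": 0.7, "select": 0.3},
         "select": {"point": 0.4, "select": 0.6}}
EMIT = {"point": {"speech": 0.2, "gesture": 0.8},
        "select": {"speech": 0.7, "gesture": 0.3}}

def within_window(events, max_gap=1.0):
    """True if consecutive (modality, timestamp) events are close
    enough in time to be considered part of one fused command."""
    return all(t2 - t1 <= max_gap
               for (_, t1), (_, t2) in zip(events, events[1:]))

def forward_prob(observations):
    """Probability of the observation sequence under the toy HMM,
    computed with the standard forward algorithm."""
    alpha = {s: START[s] * EMIT[s][observations[0]] for s in STATES}
    for obs in observations[1:]:
        alpha = {s: sum(alpha[p] * TRANS[p][s] for p in STATES)
                    * EMIT[s][obs]
                 for s in STATES}
    return sum(alpha.values())

# A pointing gesture quickly followed by speech: eligible for fusion,
# and the HMM assigns the sequence a likelihood score.
events = [("gesture", 0.0), ("speech", 0.4)]
if within_window(events):
    score = forward_prob([modality for modality, _ in events])
```

In a real fusion engine, such a score would be computed for each candidate interpretation, and the highest-scoring one selected; adaptation could then amount to re-estimating the HMM parameters from the user's observed behavior.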