This article introduces HephaisTK, a toolkit for the rapid prototyping of multimodal interfaces. After a brief discussion of the state of the art, the architecture of the toolkit is presented, along with its major features: an agent-based architecture, the ability to easily plug in new input recognizers, a fusion engine, and configuration by means of an SMUIML XML file. Finally, applications created with the HephaisTK toolkit are discussed.
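The abstract mentions that the toolkit is configured through an SMUIML XML file describing the multimodal dialogue. As an illustration only, such a description might pair input triggers with application actions along the following lines; the element and attribute names below are hypothetical sketches, not the actual SMUIML schema:

```xml
<!-- Hypothetical sketch of an SMUIML-style configuration.
     Element names are illustrative and do not reflect the real schema. -->
<smuiml>
  <recognizers>
    <!-- Input recognizers plugged into the toolkit -->
    <recognizer name="speech" type="speech_recognizer"/>
    <recognizer name="pointer" type="mouse_tracker"/>
  </recognizers>
  <dialog>
    <!-- Fuse a spoken command with a pointing gesture -->
    <transition>
      <trigger source="speech" value="put_that_there"/>
      <trigger source="pointer" value="click"/>
      <action target="client_app" message="move_object"/>
    </transition>
  </dialog>
</smuiml>
```

In such a setup, the fusion engine would be responsible for combining the two triggers when they arrive within a given time window, before dispatching the resulting action to the client application.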