Multimodal interactive systems support multiple interaction techniques, such as the synergistic use of speech, gesture and eye-gaze tracking. The flexibility they offer results in an increased complexity that current software development tools do not address appropriately. In this paper we describe a component-based approach, called ICARE, for specifying and developing multimodal interfaces. Our approach relies on two types of components: (i) elementary components that describe pure modalities and (ii) composition components (Complementarity, Redundancy and Equivalence) that enable the designer to specify the combined usage of modalities. The designer graphically assembles the ICARE components, and the code of the multimodal user interface is automatically generated. Although the ICARE platform is not fully developed, we illustrate the applicability of the approach with the implementation of two multimodal systems: MEMO, a GeoNote system, and MID, a multimodal identification interface.
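The two component types described above can be sketched in code. The following is a minimal, hypothetical illustration of the idea, not ICARE's actual API: all class and method names are assumptions. Elementary components wrap a pure modality and emit events; composition components combine those events according to the Complementarity, Redundancy and Equivalence properties.

```python
# Hypothetical sketch of ICARE-style components (names assumed, not the
# actual ICARE API). Elementary modality components emit events; composition
# components combine them per the CARE properties.

class Modality:
    """Elementary component describing a pure modality (e.g. speech, gesture)."""
    def __init__(self, name):
        self.name = name

    def emit(self, data):
        # Tag the raw data with its originating modality.
        return {"modality": self.name, "data": data}


class Equivalence:
    """Any one of the composed modalities suffices to produce the command."""
    def combine(self, events):
        # The modalities are interchangeable: take the first available event.
        return events[0]["data"] if events else None


class Redundancy:
    """Modalities convey the same information; duplicates are collapsed."""
    def combine(self, events):
        unique = []
        for e in events:
            if e["data"] not in unique:
                unique.append(e["data"])
        return unique[0] if unique else None


class Complementarity:
    """Events from several modalities are merged into a single command,
    e.g. the speech command "put that there" fused with pointing gestures."""
    def combine(self, events):
        merged = {}
        for e in events:
            merged.update(e["data"])
        return merged


speech = Modality("speech")
gesture = Modality("gesture")

# Fuse a spoken verb with a pointed-at target into one command.
command = Complementarity().combine([
    speech.emit({"verb": "move"}),
    gesture.emit({"target": (120, 45)}),
])
print(command)  # {'verb': 'move', 'target': (120, 45)}
```

In the actual platform the designer assembles such components graphically rather than in code, and the interface code is then generated automatically.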