Mobile and ubiquitous systems support multiple interaction techniques, such as the synergistic use of active modalities (speech, gesture, etc.) and passive modalities (gaze, user localization, etc.). The flexibility they offer results in an increased complexity that current software development tools do not address appropriately. In this paper we describe a component-based approach, called ICARE, for specifying and developing interfaces that combine active and passive modalities. Our approach relies on two types of components: (1) elementary components that describe pure modalities (active and passive) and (2) composition components (Complementarity, Redundancy, and Equivalence) that enable the designer to specify the combined usage of modalities. The designer graphically assembles the ICARE components, and the code of the multimodal user interface is automatically generated. Although the ICARE platform is not fully developed, we illustrate the applicability of the approach with the implementation of three mobile systems: two GeoNote systems and one prototype of cockpit commands for the Rafale (a French military aircraft).
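The two component types described above can be sketched in code. The following is a minimal, hypothetical illustration (the class and method names are illustrative assumptions, not the actual ICARE API): elementary modality components emit timestamped events, and a Complementarity composition component fuses events from different modalities that fall within a time window into a single combined command.

```python
# Hypothetical sketch of ICARE-style composition (illustrative names only):
# elementary modality components produce ModalityEvent objects, and a
# Complementarity component fuses complementary events that occur close
# together in time into one combined command.

from dataclasses import dataclass, field


@dataclass
class ModalityEvent:
    """Output of an elementary modality component (active or passive)."""
    modality: str      # e.g. "speech", "gesture", "localization"
    payload: dict      # partial command information carried by this modality
    timestamp: float   # seconds


class Complementarity:
    """Fuses events from two different modalities within a time window."""

    def __init__(self, window: float = 1.0):
        self.window = window
        self.pending: list[ModalityEvent] = []

    def push(self, event: ModalityEvent):
        # Try to pair the new event with a pending event from another modality.
        for other in self.pending:
            if (other.modality != event.modality
                    and abs(other.timestamp - event.timestamp) <= self.window):
                self.pending.remove(other)
                # Each modality supplies part of the command; merge them.
                return {**other.payload, **event.payload}
        self.pending.append(event)  # wait for a complementary event
        return None


# Example in the spirit of "put that there": speech supplies the verb,
# gesture supplies the location.
fusion = Complementarity(window=1.0)
fusion.push(ModalityEvent("speech", {"command": "put"}, 0.2))
result = fusion.push(ModalityEvent("gesture", {"target": (120, 45)}, 0.6))
# result now combines both payloads into one fused command
```

Redundancy and Equivalence components could be sketched analogously: a Redundancy component would accept temporally close events from different modalities carrying the same information, while an Equivalence component would forward an event from whichever modality fires first.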