A design space for multimodal systems: concurrent processing and data fusion. In Proceedings of the INTERCHI '93 Conference on Human Factors in Computing Systems.
Integration and synchronization of input modes during multimodal human-computer interaction. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems.
User Centered System Design: New Perspectives on Human-Computer Interaction.
Visual display, pointing, and natural language: the power of multimodal interaction. In AVI '98: Proceedings of the Working Conference on Advanced Visual Interfaces.
Unification-based multimodal integration. In ACL '98: Proceedings of the 35th Annual Meeting of the Association for Computational Linguistics and Eighth Conference of the European Chapter of the Association for Computational Linguistics.
Multimodal interactive maps: designing for human performance. Human-Computer Interaction.
Referring Phenomena in a Multimedia Context and their Computational Treatment (ReferringPhenomena '97).
Toward Natural Gesture/Speech Control of a Large Display. In EHCI '01: Proceedings of the 8th IFIP International Conference on Engineering for Human-Computer Interaction.
Following the ecological approach to visual perception, this paper presents a novel framework for the design of multimodal systems. The proposal emphasises the role of visual context in gestural communication and extends the concept of affordances to explain the variability of referring gestures. The validity of the approach is confirmed by the results of a simulation experiment. We conclude by discussing the practical implications of our findings for software architecture design.