With the rise of mobile and pervasive computing, applications increasingly need to adapt to their surrounding environments and to provide users with information about those environments in an easy and natural manner. In this paper we describe a user interface that integrates multimodal input on a handheld device with gestures performed on real-world artifacts. The approach extends reference resolution based on speech, handwriting and gesture to real-world objects that users may hold in their hands. We discuss the varied interaction channels available to users that arise from mixing and matching input modalities on the mobile device with actions performed in the environment. We also discuss the underlying components required to handle these extended multimodal interactions, and present an implementation of our ideas in a demonstrator called the Mobile ShopAssist. This demonstrator then serves as the basis for a usability study, which we also describe, on user interaction in mobile contexts.
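To make the idea of extending reference resolution to held objects concrete, the following is a minimal, illustrative sketch (not the paper's actual architecture) of how a deictic expression from one modality might be bound to an object-referring event from another. All names here, such as `InputEvent` and `resolve_reference`, and the time-window fusion strategy are assumptions introduced for illustration only.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class InputEvent:
    modality: str    # e.g. "speech", "handwriting", or "gesture"
    content: str     # recognized token ("this") or an object identifier
    timestamp: float # seconds since interaction start

def resolve_reference(deictic: InputEvent,
                      events: List[InputEvent],
                      window: float = 2.0) -> Optional[str]:
    """Bind a deictic token (e.g. spoken 'this') to the object-referring
    event closest in time, e.g. a pointing gesture or an object being
    picked up, within the given time window (a hypothetical heuristic)."""
    candidates = [e for e in events
                  if e.modality == "gesture"
                  and abs(e.timestamp - deictic.timestamp) <= window]
    if not candidates:
        return None
    best = min(candidates, key=lambda e: abs(e.timestamp - deictic.timestamp))
    return best.content

# Usage: "What does <this> cost?" spoken while picking up a camera.
speech = InputEvent("speech", "this", timestamp=10.2)
env = [InputEvent("gesture", "camera_olympus_c50", timestamp=10.5),
       InputEvent("gesture", "camera_nikon_d70", timestamp=14.0)]
print(resolve_reference(speech, env))  # camera_olympus_c50
```

A real system would also weigh recognizer confidence scores across modalities rather than relying on timing alone; the sketch only shows the basic cross-modal binding step.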