Exploring cues and rhythm for designing multimodal tools to support mobile users in wayfinding
CHI '09 Extended Abstracts on Human Factors in Computing Systems
When navigating real physical environments, human beings tend to make systematic or near-systematic errors in judging distance, direction, and other aspects of navigation. To avoid these errors, we adopt different strategies to find our way. While many HCI studies have produced navigation design guidelines for map-based, speech-based, or tactile guidance on mobile devices, in this paper we present an initial study of multimodal navigation design that draws on the design practice of episodes of motion, which originates in urban planning. Following the guidelines that episodes of motion suggest, we explore the design implications of providing cues and rhythm in a study in which pedestrians performed wayfinding tasks in an urban area. The main contributions of this paper are an evaluation of these design implications in the context of mobile wayfinding tasks and a reflection on the results in light of human wayfinding behaviour. We conclude that by designing predictive cues and rhythm into mobile multimodal navigation applications, we can improve navigation aids for users.