In this article we describe and evaluate a novel, low-interaction-cost approach to supporting the spontaneous discovery of geo-tagged information while on the move. Our mobile haptic prototype helps users explore their environment by providing directional vibrotactile feedback when location-tagged data is present nearby. We conducted a study to investigate whether users can find these targets while walking, comparing their performance when using haptic feedback alone against an equivalent visual system. The results are encouraging; we present our findings and discuss their significance, along with issues relevant to the design of future systems that combine haptics with location awareness.
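The abstract does not detail how directional feedback is derived, but the core of any such system is comparing the bearing from the user to a geo-tagged target against the direction the device is pointing, and vibrating when they roughly align. The sketch below illustrates this idea under stated assumptions; the function names, the cone width, and the great-circle bearing formula are our own choices, not details from the paper.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def should_vibrate(user_lat, user_lon, heading_deg, target_lat, target_lon,
                   cone_deg=30.0):
    """Trigger vibrotactile feedback when the target lies within a cone
    centred on the device's current heading (cone width is an assumption)."""
    target_bearing = bearing_deg(user_lat, user_lon, target_lat, target_lon)
    # Smallest signed angular difference between heading and target bearing.
    diff = abs((target_bearing - heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= cone_deg / 2.0
```

For example, a user at the origin pointing due north would receive feedback for a target directly north of them, but not for one due east; sweeping the device round the horizon would then reveal each target's direction in turn.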