Mapping information to audio and tactile icons
Proceedings of the 2009 International Conference on Multimodal Interfaces
This paper reports an experiment into the design of crossmodal icons, which can provide an alternative form of output for mobile devices by using the audio and tactile modalities to communicate information. A complete set of crossmodal icons was created by encoding three dimensions of information in three crossmodal auditory/tactile parameters, with Earcons used for the audio icons and Tactons for the tactile ones. The experiment investigated absolute identification of audio and tactile crossmodal icons when users were trained in one modality and tested in the other, with no training in the test modality, to see whether knowledge could be transferred between modalities. We also compared performance when users were stationary and mobile to see what effects mobility might have on recognition of the cues. Participants trained in sound with Earcons and then tested with the same messages presented via Tactons recognized 85% of messages when stationary and 76% when mobile. When trained with Tactons and tested with Earcons, participants recognized 76.5% of messages when stationary and 71% when mobile. These results suggest that participants can recognize and understand a message presented in a different modality very effectively. The findings will aid designers of mobile displays in creating effective crossmodal cues that require minimal training and can provide alternative presentation modalities when the context requires.
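The encoding idea described above — three information dimensions mapped onto three parameters that exist in both the audio and tactile modalities — can be sketched in code. This is a minimal illustrative sketch, not the paper's implementation: the dimension names (`message_type`, `urgency`, `sender_group`) and the parameter names (`rhythm`, `texture`, `spatial_location`) are assumptions chosen for illustration, drawn from the kinds of parameters commonly used in Earcon and Tacton design rather than stated in the abstract.

```python
# Hypothetical sketch of a crossmodal icon encoding. Three information
# dimensions are mapped onto three parameters that have analogues in both
# the audio (Earcon) and vibrotactile (Tacton) modalities, so the same
# abstract cue can be rendered in either modality. All dimension and
# parameter names here are illustrative assumptions, not from the paper.

# Each information dimension maps to exactly one crossmodal parameter.
PARAMETER_FOR_DIMENSION = {
    "message_type": "rhythm",            # e.g. voicemail vs. text vs. email
    "urgency": "texture",                # e.g. smooth vs. rough
    "sender_group": "spatial_location",  # e.g. left vs. centre vs. right
}

def encode_crossmodal_icon(message_type, urgency, sender_group):
    """Return a modality-independent description of a cue.

    Because the parameters are shared across modalities, a user trained
    to recognise these values in Earcons could, in principle, recognise
    the same message when it is rendered as a Tacton (and vice versa).
    """
    values = {
        "message_type": message_type,
        "urgency": urgency,
        "sender_group": sender_group,
    }
    return {param: values[dim] for dim, param in PARAMETER_FOR_DIMENSION.items()}

icon = encode_crossmodal_icon("voicemail", "urgent", "family")
# The resulting dict describes the cue in terms of the three crossmodal
# parameters; separate audio and tactile renderers would interpret it.
```

The key design point this sketch captures is that the icon is specified once, in modality-independent terms, which is what allows training in one modality to transfer to the other.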