This paper describes a user study of interaction with a mobile device installed in a driving simulator. Two new auditory interfaces were proposed, and their effectiveness and efficiency were compared to those of a standard visual interface. Both auditory interfaces used spatialized auditory cues to represent the individual items of a hierarchical menu: the first played all items of the current menu level simultaneously, while the second played only one item at a time. The visual interface was shown on a small in-vehicle LCD screen on the dashboard. In all three cases, a custom-made interaction device (a scrolling wheel and two buttons) attached to the steering wheel was used to control the interface. Driving performance, task completion times, perceived workload and overall user satisfaction were evaluated.

The experiment showed that both auditory interfaces were effective in a mobile environment, although they were not faster than the visual interface. For shorter tasks, such as changing the active profile or deleting an image, task completion times were comparable across all interfaces; however, driving performance was significantly better and perceived workload lower when using the auditory interfaces. The test subjects also reported high overall satisfaction with the auditory interfaces, rating them as easier to use, more satisfying and more adequate for the required tasks than the visual interface. These results are not surprising, since the visual interface competes with the primary task (driving the car) for visual attention far more strongly than the auditory interfaces do. So although both types of interface proved effective, the visual interface was less efficient, as it strongly distracted the user from performing the primary task.
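To make the idea of spatialized menu items concrete, the sketch below distributes the items of one menu level across a frontal azimuth arc and derives simple stereo panning gains per item. This is an illustrative assumption, not the authors' implementation: the 180-degree arc, even spacing, and constant-power panning (a crude stand-in for full HRTF-based 3-D rendering) are all hypothetical choices.

```python
import math

def menu_item_azimuths(n_items, arc_deg=180.0):
    """Spread n menu items evenly across a frontal arc (degrees).

    Assumed layout: items span [-arc_deg/2, +arc_deg/2] around the
    listener; a single item is placed straight ahead at 0 degrees.
    """
    if n_items == 1:
        return [0.0]
    step = arc_deg / (n_items - 1)
    return [-arc_deg / 2.0 + i * step for i in range(n_items)]

def pan_gains(azimuth_deg):
    """Constant-power stereo gains (left, right) for an azimuth in
    [-90, 90] degrees; -90 is fully left, +90 fully right."""
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)
```

For example, four items of a menu level would be placed at -90, -30, +30 and +90 degrees; in the "simultaneous" interface all four sources play at once from these positions, while in the "one at a time" interface only the currently selected position is rendered.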