Auditory displays have been used in both human-machine and computer interfaces. However, the use of non-speech audio in assistive communication for people with language disabilities, or in other applications that rely on visual representations, remains under-investigated. In this paper, we introduce SoundNet, a linguistic database that associates natural environmental sounds with words and concepts. A sound-labeling study was carried out to verify SoundNet's associations and to investigate how well the sounds evoke concepts. A second study used the verified SoundNet data to explore how effectively environmental sounds convey concepts in sentence contexts, compared with conventional icons and animations. Our results show that sounds can effectively illustrate concepts, especially concrete ones, and can be applied in assistive interfaces.
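As a purely illustrative sketch (not SoundNet's actual schema, which the abstract does not specify), a sound-to-concept association of this kind could be modeled as a many-to-many mapping in which each link carries an agreement score from a labeling study, used to rank candidate concepts for a sound or to pick the sound that most reliably evokes a concept:

```python
from collections import defaultdict


class SoundConceptIndex:
    """Hypothetical many-to-many index from environmental sounds to concepts.

    Each association carries an agreement score, e.g. the fraction of
    labeling-study participants who matched the sound to the concept.
    """

    def __init__(self):
        self._by_sound = defaultdict(dict)    # sound_id -> {concept: score}
        self._by_concept = defaultdict(dict)  # concept -> {sound_id: score}

    def associate(self, sound_id, concept, score):
        self._by_sound[sound_id][concept] = score
        self._by_concept[concept][sound_id] = score

    def concepts_for(self, sound_id, min_score=0.0):
        """Concepts evoked by a sound, strongest agreement first."""
        scores = self._by_sound[sound_id]
        return sorted((c for c, s in scores.items() if s >= min_score),
                      key=lambda c: -scores[c])

    def best_sound(self, concept):
        """Sound that most reliably evokes a concept, or None if unknown."""
        sounds = self._by_concept.get(concept)
        if not sounds:
            return None
        return max(sounds, key=sounds.get)


# Hypothetical data: a dog bark strongly evokes "dog", weakly "animal".
idx = SoundConceptIndex()
idx.associate("bark.wav", "dog", 0.92)
idx.associate("bark.wav", "animal", 0.40)
idx.associate("meow.wav", "cat", 0.95)
print(idx.concepts_for("bark.wav", min_score=0.5))  # ['dog']
print(idx.best_sound("cat"))                        # meow.wav
```

An assistive interface built on such an index could query `best_sound("dog")` to play the most recognizable cue for a target concept, with the `min_score` threshold filtering out weakly agreed associations.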