In this paper we examine design issues that affect the success of multimodal displays combining the acoustic and haptic modalities. First, we explore issues affecting successful sonification design and suggest how the language of electroacoustic music can assist. Next, we introduce haptic interaction in the light of this discussion, focusing in particular on the roles of gesture and mimesis. Finally, we make some observations about issues that arise when the haptic and acoustic modalities are combined in the interface. We then examine examples where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discuss lessons that may be drawn from these domains and applied to multimodal human-computer interaction. We argue that combined haptic-auditory interaction schemes can be thought of as musical instruments, and we raise some of the possible ramifications of this view.