Towards a model human cochlea: sensory substitution for crossmodal audio-tactile displays
GI '08 Proceedings of graphics interface 2008
In this paper, we describe the Model Human Cochlea (MHC), a sensory substitution technique for translating some of the emotional content expressed in music onto a haptic ambient display, and we present the issues and challenges encountered in designing the model. This research is situated within the domain of crossmodal displays, with a specific focus on enhancing the entertainment experience of film audio for users who are deaf or hard of hearing. In its final form factor, the interface will be integrated into an EmotiChair, a multi-sensory entertainment interface that supports crossmodal audio-haptic display interactions. To assist with the design of the MHC, we have developed a flexible prototype to support research in crossmodal audio-haptic displays. We present details of the multidisciplinary design process that informed the development of the MHC prototype, and the evaluation methodology adopted to explore its different configurations.
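The abstract does not spell out how audio is mapped onto the haptic display, but a cochlea-inspired substitution technique of this kind typically decomposes the audio signal into frequency bands, each driving a separate vibrotactile channel. The sketch below illustrates one way such a decomposition could work, using FFT-based band masking; the function name, band edges, and sample rate are illustrative assumptions, not details from the paper.

```python
import numpy as np

def split_into_bands(signal, sample_rate, band_edges):
    """Split a mono audio signal into frequency-band channels.

    Illustrative sketch of a cochlea-style decomposition: each
    (low_hz, high_hz) pair in band_edges yields one output channel
    containing only that band's content, e.g. for routing to one
    vibrotactile actuator. Returns shape (n_bands, len(signal)).
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    bands = []
    for low, high in band_edges:
        mask = (freqs >= low) & (freqs < high)  # keep only this band's bins
        bands.append(np.fft.irfft(spectrum * mask, n=len(signal)))
    return np.array(bands)

# Example: a 440 Hz tone should land almost entirely in the middle channel.
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
edges = [(0, 200), (200, 800), (800, 4000)]  # hypothetical band layout
channels = split_into_bands(tone, sr, edges)
```

Because the masks here partition the spectrum, the channels sum back to (approximately) the original signal; a real prototype would more likely use time-domain bandpass filters per actuator, but the band-splitting idea is the same.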