The design and user evaluation of a multimodal interaction style for music programming are described. The user requirements were instant usability and optional use of a visual display. The interaction style is built on a visual roller metaphor: users control the rollers by manipulating a force feedback trackball, while tactual and auditory cues strengthen the roller impression and support use without a visual display. The evaluation investigated task performance and procedural learning during music programming tasks performed with and without a visual display; no procedural instructions were provided. Tasks could be completed successfully in both conditions, although programming without a display took more time. Prior experience with a visual display did not improve subsequent performance without one. Working without a display requires procedures to be acquired and remembered explicitly, as more procedures were remembered after working without a visual display. The results demonstrate that multimodality provides new ways to interact with music.