This article addresses the synthesis and control of sound attributes from a perceptual point of view, focusing on an attribute related to the general concept of motion evoked by sounds. To investigate this concept, we tested 40 monophonic abstract sounds on listeners via a questionnaire and drawings made with a parametrized custom interface. This procedure, defined with synthesis and control perspectives in mind, provides an alternative means of determining intuitive control parameters for synthesizing sounds that evoke motion. Results showed that listeners distinguished three main shape categories (linear, with regular oscillations, and with circular oscillations) and three types of direction (rising, descending, and horizontal). The subjects also perceived low-frequency oscillations (below 8 Hz) quite accurately. Our analyses of the participants' drawings further revealed three size categories (small, medium, and large), three levels of randomness (none, low-amplitude irregularities, and high-amplitude irregularities), and three levels of speed (constant speed, and speeds showing medium and large variations). We then performed a perceptual test with synthesized sounds combined with visual trajectories to confirm the relevance of some of these variables. Based on these results, we drew up a general typology of evoked motion and designed an intuitive control strategy based on a symbolic representation of continuous trajectories (provided by devices such as motion capture systems and pen tablets). These generic tools could be used in a wide range of applications such as sound design, virtual reality, sonification, and music.