A crucial set of decisions in digital musical instrument design concerns the choice of mappings between the parameters a performer controls and the synthesis algorithms that actually generate sound. Feature-based synthesis offers a way to parameterize audio synthesis in terms of quantifiable perceptual characteristics, or features, that the performer wishes the sound to take on. Techniques for accomplishing such mappings and for running feature-based synthesis in real time are discussed. An example shows how a real-time performance system might be designed to exploit the ability of feature-based synthesis to provide perceptually meaningful control over a large number of synthesis parameters.
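The core idea, inverting a mapping from synthesis parameters to perceptual features, can be illustrated with a minimal sketch. This is not the system described in the abstract; the toy synthesizer, its hypothetical `brightness` parameter, and the brute-force search are all assumptions introduced here purely to show the shape of a feature-based mapping (compute a feature such as spectral centroid from synthesized audio, then search parameter space for the value that best matches a target feature):

```python
import math

SR = 8000.0  # sample rate (Hz); toy value chosen for speed

def synthesize(brightness, n_harmonics=8, n_samples=512, f0=220.0):
    """Toy additive synth. 'brightness' (0..1) is a hypothetical parameter
    controlling harmonic rolloff: higher brightness -> slower rolloff."""
    out = []
    for i in range(n_samples):
        t = i / SR
        out.append(sum((h ** -(2.0 - brightness)) *
                       math.sin(2 * math.pi * f0 * h * t)
                       for h in range(1, n_harmonics + 1)))
    return out

def spectral_centroid(samples):
    """A perceptual feature: magnitude-weighted mean frequency (Hz),
    computed with a naive DFT for self-containedness."""
    n = len(samples)
    num = den = 0.0
    for k in range(1, n // 2):
        re = sum(samples[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(samples[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        mag = math.hypot(re, im)
        num += (k * SR / n) * mag
        den += mag
    return num / den if den else 0.0

def match_feature(target_centroid, grid=None):
    """Feature-based control: find the synthesis parameter whose output
    best matches a target feature value. Brute-force search stands in for
    the faster mapping techniques a real-time system would need."""
    grid = grid if grid is not None else [i / 10 for i in range(11)]
    return min(grid,
               key=lambda b: abs(spectral_centroid(synthesize(b)) - target_centroid))
```

Because brightness here raises harmonic amplitudes monotonically, the centroid grows with the parameter and the search is well behaved; real feature-to-parameter mappings are typically many-to-one and non-monotonic, which is what makes the mapping problem discussed in the paper nontrivial.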