Controlling a physical model with a 2D force matrix
NIME '07 Proceedings of the 7th international conference on New interfaces for musical expression
We describe the design and implementation of an adaptive system that maps control parameters to modal audio synthesis parameters in real time. The modal parameters describe the linear response of a virtual vibrating solid, which is played as a musical instrument through a separate interface. The system uses a three-layer feedforward neural network trained by backpropagation on a discrete set of input-output examples. After training, the network extends the training set, which serves as a specification-by-example of the controller, to a continuous mapping that allows real-time morphing of synthetic sound models.

We have implemented a prototype application with a controller that collects data from a hand-drawn digital picture. The virtual instrument consists of a bank of modal resonators whose frequencies, dampings, and gains are the controlled parameters. We train the system by providing pictorial representations of physical objects, such as a bell or a lamp, and associating each input with a high-quality modal model obtained from measurements on the corresponding real object. After training, the user can draw pictures interactively and "play" modal models that are interesting (though unrealistic) real-time interpolations of the models in the training set.
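To make the synthesis side concrete: a modal model renders sound as a gain-weighted sum of exponentially damped sinusoids, one per resonant mode. The sketch below (not the authors' implementation; all parameter values are hypothetical) shows such a resonator bank, plus a plain linear interpolation of two parameter sets standing in for the continuous morphing that the trained network provides.

```python
import numpy as np

def modal_synthesis(freqs, dampings, gains, duration=1.0, sr=44100):
    """Impulse response of a bank of modal resonators.

    Each mode is an exponentially damped sinusoid; their gain-weighted
    sum approximates the linear response of a struck vibrating solid.
    freqs are in Hz, dampings in 1/s, gains are dimensionless.
    """
    t = np.arange(int(duration * sr)) / sr
    out = np.zeros_like(t)
    for f, d, g in zip(freqs, dampings, gains):
        out += g * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    return out

def morph(params_a, params_b, alpha):
    """Linear interpolation between two modal parameter sets -- a crude
    stand-in for the network's learned mapping, which interpolates the
    training examples nonlinearly."""
    return [(1.0 - alpha) * a + alpha * b for a, b in zip(params_a, params_b)]

# Hypothetical bell-like model: three modes (frequency, damping, gain).
bell_freqs, bell_damps, bell_gains = [523.0, 1414.0, 2712.0], [3.0, 5.0, 9.0], [1.0, 0.6, 0.3]
y = modal_synthesis(bell_freqs, bell_damps, bell_gains, duration=0.5)
```

Morphing toward a second (e.g. lamp-like) parameter set is then a matter of resynthesizing with `morph(bell_freqs, lamp_freqs, alpha)` and so on as `alpha` sweeps from 0 to 1, which is what the network does continuously from the drawn-picture input.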