This paper describes a real-time music-arranging system that reacts to immediate affective cues from a listener. We collected data on the capacity of selected musical dimensions to shift a listener's affective state, using sound files created expressly for the experiment by composing, segmenting, and re-assembling music along those dimensions. From the listener data, we built a probabilistic state-transition model that infers the listener's current affective state, and a second model that selects and re-arranges ('re-mixes') music segments to induce a target affective state. We propose that this approach offers a new perspective on characterizing musical preference.
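The two models above can be illustrated with a minimal sketch. This is not the authors' implementation: the affective-state labels, the segment features, and all transition probabilities are illustrative assumptions. The idea is that a conditional distribution P(next state | current state, segment) drives both inference and greedy selection of the segment most likely to move the listener toward a target state.

```python
# Hedged sketch of a probabilistic state-transition model over discrete
# affective states, with greedy segment selection toward a target state.
# All state names, segment labels, and probabilities are invented examples.

# P(next_state | current_state, segment) -- assumed, hand-picked values
TRANSITIONS = {
    ("calm", "upbeat"):  {"calm": 0.2, "happy": 0.7, "tense": 0.1},
    ("calm", "dense"):   {"calm": 0.3, "happy": 0.2, "tense": 0.5},
    ("happy", "upbeat"): {"calm": 0.1, "happy": 0.8, "tense": 0.1},
    ("happy", "dense"):  {"calm": 0.2, "happy": 0.4, "tense": 0.4},
    ("tense", "upbeat"): {"calm": 0.3, "happy": 0.4, "tense": 0.3},
    ("tense", "dense"):  {"calm": 0.1, "happy": 0.1, "tense": 0.8},
}

def next_state_distribution(state, segment):
    """Return the distribution over next affective states after playing
    a segment with the given feature to a listener in `state`."""
    return TRANSITIONS[(state, segment)]

def pick_segment(state, target):
    """Greedily choose the segment feature that maximizes the probability
    of moving the listener from `state` toward `target`."""
    segments = {seg for (s, seg) in TRANSITIONS if s == state}
    return max(segments, key=lambda seg: TRANSITIONS[(state, seg)][target])
```

For example, for a listener inferred to be in the "tense" state with a "happy" target, the greedy rule selects the "upbeat" segment, since its assumed transition probability toward "happy" (0.4) exceeds that of "dense" (0.1). A real system would estimate these probabilities from the listener data described above.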