Towards the automatic detection of involvement in conversation
COST'10: Proceedings of the 2010 International Conference on Analysis of Verbal and Nonverbal Communication and Enactment
This paper introduces an approach for evaluating and predicting listeners' emotional engagement with particular musical performances. A set of audio parameters (cues) is extracted from recordings of two contrasting movements from Bach's Solo Violin Sonatas and Partitas and compared with listeners' responses, collected by moving a slider while listening to the music. The cues showing the highest correlations are then used to generate decision trees and a set of rules for predicting the emotional engagement experienced by potential listeners of similar pieces. The model is tested on two different movements of the Solos, with very promising results.
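The pipeline outlined in the abstract (extract audio cues, correlate them with continuous slider responses, keep the best-correlated cues, and learn a tree-based rule set) can be sketched as follows. This is a minimal illustration under assumed choices: the specific cues (RMS energy, spectral centroid, onset strength), file names, correlation threshold, and the use of a scikit-learn CART regressor in place of the C4.5-style trees mentioned in the abstract are all assumptions, not the authors' actual implementation.

```python
# Sketch of the cue-extraction / correlation / decision-tree pipeline.
# Feature choices, file paths, and thresholds below are illustrative assumptions.
import numpy as np
import librosa
from scipy.stats import pearsonr
from sklearn.tree import DecisionTreeRegressor, export_text

HOP = 512    # analysis hop size in samples (assumed)
SR = 22050   # analysis sampling rate (assumed)

def extract_cues(path):
    """Extract frame-wise audio cues: dynamics, brightness, articulation proxy."""
    y, sr = librosa.load(path, sr=SR)
    rms = librosa.feature.rms(y=y, hop_length=HOP)[0]
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr, hop_length=HOP)[0]
    onset = librosa.onset.onset_strength(y=y, sr=sr, hop_length=HOP)
    n = min(len(rms), len(centroid), len(onset))
    return np.column_stack([rms[:n], centroid[:n], onset[:n]]), ["rms", "centroid", "onset"]

def align(values, n_frames):
    """Resample a continuous slider trace to the number of audio analysis frames."""
    x_old = np.linspace(0.0, 1.0, num=len(values))
    x_new = np.linspace(0.0, 1.0, num=n_frames)
    return np.interp(x_new, x_old, values)

# Train on two contrasting movements (paths and slider files are placeholders).
X_parts, y_parts, names = [], [], []
for audio_path, slider_path in [("movement_a.wav", "slider_a.csv"),
                                ("movement_b.wav", "slider_b.csv")]:
    cues, names = extract_cues(audio_path)
    slider = np.loadtxt(slider_path, delimiter=",")  # mean listener engagement over time
    X_parts.append(cues)
    y_parts.append(align(slider, len(cues)))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

# Keep only the cues whose correlation with the engagement trace exceeds a threshold.
keep = [i for i in range(X.shape[1]) if abs(pearsonr(X[:, i], y)[0]) > 0.3]
tree = DecisionTreeRegressor(max_depth=3).fit(X[:, keep], y)

# The fitted tree can be read off as a small rule set and applied to unseen movements.
print(export_text(tree, feature_names=[names[i] for i in keep]))
```

A shallow tree is used so that its branches can be read as explicit if-then rules, mirroring the rule set described in the abstract; prediction on a held-out movement would reuse `extract_cues` and `tree.predict` on the retained cue columns.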