A sound event can carry multiple types of information, related both to its source and to the ambient environment. Moreover, it is well known that sound evokes emotions, a fact verified by work in the disciplines of Music Emotion Recognition and Music Information Retrieval that has focused on the impact of music on emotion. In this work we introduce the concept of an affective acoustic ecology, which extends this relation to sound events in general. To this end, we define the sound event as a novel audio structure with multiple components. We further investigate the application of existing emotion models, employed in music affective analysis, to sonic, non-musical content. The obtained results indicate that although such an application is feasible, no significant trends or classification outcomes are observed that would allow the definition of an analytic relation between the technical characteristics of a sound-event waveform and the emotions it evokes.
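The emotion models borrowed from music affective analysis typically place a stimulus in the two-dimensional valence-arousal plane and label it by quadrant. As a minimal sketch of that mapping (the function name, quadrant labels, and the assumption that ratings are normalized to [-1, 1] are illustrative choices, not taken from the paper):

```python
def va_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair to one of the four quadrants
    of the valence-arousal emotion plane. Both values are assumed
    to be normalized to the range [-1, 1]; boundary values fall
    into the non-negative quadrant."""
    if valence >= 0 and arousal >= 0:
        return "happy/excited"      # positive valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry/anxious"      # negative valence, high arousal
    if valence < 0:
        return "sad/depressed"      # negative valence, low arousal
    return "calm/relaxed"           # positive valence, low arousal


# Hypothetical listener ratings for two sound events:
print(va_quadrant(0.6, 0.7))    # e.g. birdsong on a clear morning
print(va_quadrant(-0.5, 0.8))   # e.g. a nearby alarm siren
```

In practice such labels would be predicted from acoustic features of the sound-event waveform rather than assigned from ratings directly; the paper's finding is precisely that this feature-to-emotion mapping shows no clear analytic trend for non-musical sound events.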