DEAP: A Database for Emotion Analysis Using Physiological Signals
IEEE Transactions on Affective Computing
Nowadays, the amount of multimedia content is increasing explosively, and it is often challenging to find content that will be appealing or that matches a user's current mood or affective state. Achieving this goal requires an efficient indexing technique that annotates multimedia content so that the annotations can be used during retrieval with an appropriate query. One approach to such indexing is to determine the affect (type and intensity) that a piece of multimedia can induce in a user. In this paper, affective content analysis of music video clips is performed to determine the emotion they can induce in people. To this end, a subjective test was conducted in which 32 participants watched different music video clips and assessed their induced emotions. These self-assessments were used as ground truth, and the results of classification using audio, visual, and audiovisual features extracted from the music video clips are presented and compared.
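The pipeline described above — feature vectors per clip, participants' self-assessments as ground-truth labels, and a classifier mapping features to induced emotion — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the toy "audio energy" and "visual motion" features, the high/low-arousal labels, and the nearest-centroid rule are all assumptions chosen for brevity.

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(features, labels):
    """Build one centroid per emotion label from labeled clip features."""
    by_class = {}
    for f, y in zip(features, labels):
        by_class.setdefault(y, []).append(f)
    return {y: centroid(fs) for y, fs in by_class.items()}

def predict(model, feature):
    """Assign the label of the closest class centroid (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(model, key=lambda y: dist(model[y], feature))

# Toy audiovisual features per clip: [audio energy, visual motion],
# labeled by (hypothetical) participant self-assessments of arousal.
features = [[0.9, 0.8], [0.8, 0.7], [0.2, 0.1], [0.1, 0.3]]
labels = ["high_arousal", "high_arousal", "low_arousal", "low_arousal"]

model = train(features, labels)
print(predict(model, [0.85, 0.75]))  # prints "high_arousal"
```

In practice the features would be richer (e.g., spectral audio descriptors, color and motion statistics) and the classifier more capable, but the structure — labeled training clips, learned model, prediction for unseen clips — is the same.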