In this paper, we propose an approach for affective ranking of movie scenes based on the emotions actually felt by spectators. Such a ranking can be used to characterize the affective, or emotional, content of video clips: it can, for instance, help determine which video clip from a database elicits the most joy for a given user. This in turn permits video indexing and retrieval based on affective criteria corresponding to a personalized user affective profile. A dataset of 64 different scenes from 8 movies was shown to eight participants. While they watched, five peripheral physiological signals were recorded: GSR (galvanic skin response), EMG (electromyogram), blood pressure, respiration pattern, and skin temperature. After watching each scene, the participants self-assessed the arousal and valence they felt for that scene. In addition, each movie scene was analyzed to characterize it with various audio- and video-based features capturing the key elements of the events occurring within it. Arousal and valence levels were then estimated by a linear combination of features from the physiological signals, as well as by a linear combination of content-based audio and video features. We show that a correlation exists between the arousal- and valence-based rankings derived from the spectators' self-assessments and the rankings obtained automatically from either the physiological signals or the audio-video features. This demonstrates that participants' physiological responses can be used to characterize video scenes and to rank them according to their emotional content, and that audio-visual features, individually or combined, can fairly reliably predict the spectator's felt emotion for a given scene.
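The evaluation described above can be sketched in a few lines: fit a linear combination of per-scene features to the self-assessed ratings, then compare the predicted ranking against the self-assessed ranking with a rank correlation. This is an illustrative reconstruction, not the paper's code; the synthetic feature matrix, weights, and noise level are assumptions.

```python
import numpy as np

# Synthetic stand-in data (assumed, for illustration only):
# 64 scenes, 5 physiological features per scene, as in the study's setup.
rng = np.random.default_rng(0)
n_scenes, n_features = 64, 5
X = rng.normal(size=(n_scenes, n_features))          # per-scene feature vectors
true_w = rng.normal(size=n_features)                 # hypothetical ground-truth weights
self_assessed = X @ true_w + 0.1 * rng.normal(size=n_scenes)  # noisy arousal ratings

# Estimate arousal as a linear combination of features (least-squares fit).
w, *_ = np.linalg.lstsq(X, self_assessed, rcond=None)
predicted = X @ w

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    ranks_a = np.argsort(np.argsort(a)).astype(float)
    ranks_b = np.argsort(np.argsort(b)).astype(float)
    return np.corrcoef(ranks_a, ranks_b)[0, 1]

# Correlation between the self-assessed ranking and the automatic ranking.
rho = spearman(self_assessed, predicted)
print(f"rank correlation: {rho:.3f}")
```

With low noise the fitted linear model reproduces the self-assessed ranking almost exactly, so the rank correlation is close to 1; in practice the reported correlations would be lower but still significant.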
The results also confirm that participants exhibit different affective responses to the same movie scenes, which emphasizes the need for emotional profiles to be user-dependent.