In this paper, we propose an approach for the affective representation of movie scenes based on the emotions actually felt by spectators. Such a representation can be used to characterize the emotional content of video clips, e.g., for affective video indexing and retrieval or for neuromarketing studies. A dataset of 64 scenes from eight movies was shown to eight participants, whose physiological responses were recorded while they watched the clips. The participants were also asked to self-assess their felt emotional arousal and valence for each scene. In addition, content-based audio and video features were extracted from the movie scenes to characterize each one. Degrees of arousal and valence were then estimated by a linear combination of features from the physiological signals, as well as by a linear combination of the content-based features. We showed that a significant correlation exists between the arousal and valence grades provided by the spectators' self-assessments and the affective grades obtained automatically from either the physiological responses or the audio-video features. This demonstrates the feasibility of using multimedia features and physiological responses to predict the affect a user is expected to feel in response to emotional video content.
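To make the estimation step concrete, the sketch below illustrates the general idea of fitting a linear combination of features to self-assessed affect grades and then measuring the Pearson correlation between the estimates and the self-assessments, once for physiological features and once for content-based features. This is not the paper's implementation: the feature matrices, dimensions, and random data here are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import pearsonr


def fit_linear(X, y):
    """Least-squares fit of y ~ X (with an intercept); returns the weights."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w


def predict(X, w):
    """Apply the learned linear combination to new feature rows."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return Xb @ w


# Hypothetical inputs: one row per movie scene (64 scenes, as in the paper).
rng = np.random.default_rng(0)
n_scenes = 64
X_phys = rng.normal(size=(n_scenes, 10))     # physiological features (placeholder)
X_content = rng.normal(size=(n_scenes, 12))  # audio/video content features (placeholder)
arousal = rng.normal(size=n_scenes)          # self-assessed arousal per scene (placeholder)

for name, X in [("physiological", X_phys), ("content-based", X_content)]:
    w = fit_linear(X, arousal)
    r, p = pearsonr(predict(X, w), arousal)
    print(f"{name} features: Pearson r = {r:.2f} (p = {p:.3f})")
```

The same procedure applies to valence by substituting the valence self-assessments for `arousal`. Note that fitting and correlating on the same scenes yields optimistic in-sample correlations; a proper evaluation would hold out scenes (e.g., via cross-validation) before computing the correlation.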