This research looks into a new direction in multimedia content analysis: the extraction and modeling of the affective content of an arbitrary video. The affective content is viewed as the amount of feeling/emotion contained in and mediated by a video toward a viewer. The ability to automatically extract video content of this nature will lead to a high level of personalization in broadcast delivery to private users, and will considerably broaden the possibilities for efficiently handling and presenting the large amounts of audio-visual data stored in emerging video databases. The technique we have developed uses the "dimensional approach to affect" concept underpinned by psychophysiology studies. Our computational method sets out to represent the affective content as feature points in the so-called 2D emotion space. From low-level video characteristics we obtain time curves that represent the two affect dimensions, arousal and valence, for the video under consideration. Combining the two time curves yields the so-called affect curve, which we regard as a reliable representation of the transitions from one feeling to another along a video, as perceived by a viewer. We illustrate the success of our technique on excerpts taken from an action movie and a typical soccer game, respectively.
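
To make the pipeline concrete, below is a minimal sketch in Python (with NumPy) of how arousal and valence time curves could be derived from per-frame low-level features and paired into an affect curve. The choice of features (motion activity, shot-cut rate, sound energy for arousal; audio pitch for valence), the Kaiser-window smoothing, and the combination weights are illustrative assumptions for this sketch, not the authors' exact formulation.

import numpy as np

def smooth(signal, window_len=101, beta=5.0):
    # Low-pass the frame-level signal with a Kaiser window so the
    # curve varies on the timescale of perceived mood changes.
    win = np.kaiser(window_len, beta)
    win /= win.sum()
    pad = window_len // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.convolve(padded, win, mode="valid")

def normalize(signal, lo=0.0, hi=1.0):
    # Rescale a curve to the range [lo, hi].
    s_min, s_max = signal.min(), signal.max()
    if s_max == s_min:
        return np.full_like(signal, (lo + hi) / 2.0)
    return lo + (hi - lo) * (signal - s_min) / (s_max - s_min)

def affect_curve(motion, cut_rate, energy, pitch):
    # All four inputs are hypothetical per-frame feature arrays of
    # equal length; the weights below are illustrative, not tuned.
    # Arousal: intensity of the experience, driven here by motion
    # activity, shot-cut density, and sound energy.
    arousal = normalize(smooth(0.4 * motion + 0.3 * cut_rate + 0.3 * energy))
    # Valence: pleasantness of the experience, sketched here from
    # audio pitch alone, mapped to [-1, 1].
    valence = normalize(smooth(pitch), lo=-1.0, hi=1.0)
    # The affect curve: a trajectory through the 2D emotion space,
    # one (arousal, valence) point per frame.
    return np.column_stack([arousal, valence])

# Example usage with random stand-ins for a 1000-frame clip:
rng = np.random.default_rng(0)
curve = affect_curve(rng.random(1000), rng.random(1000),
                     rng.random(1000), rng.random(1000))
print(curve.shape)  # (1000, 2)

Smoothing before normalization matters here: without it, single-frame spikes in motion or sound energy would dominate the curve, whereas the perceived feeling of a scene changes on the scale of seconds.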