We present a method, currently under development, for extracting information about characters' emotions in films. We suggest that this information can help describe higher-level multimedia semantics relating to narrative structure. Our method extracts the information from audio description, which is provided for visually impaired audiences with an increasing number of films. It is based on a cognitive theory of emotions that links a character's emotional states to events in their environment. In this paper we describe the method, present a preliminary evaluation, and discuss the kinds of novel video retrieval and browsing applications it may support.
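As a rough illustration only (not the authors' actual method, which is grounded in a cognitive theory of emotions), emotion cues in an audio-description sentence might be spotted with a small keyword lexicon. The lexicon entries and the example sentence below are invented for demonstration:

```python
import re

# Hypothetical mini-lexicon mapping emotion words in audio description
# to emotion categories; a real system would be far larger and would
# link emotions to described events, per the cognitive theory.
EMOTION_LEXICON = {
    "afraid": "fear", "terrified": "fear",
    "happy": "joy", "delighted": "joy",
    "angry": "anger", "furious": "anger",
    "sad": "sadness", "weeps": "sadness",
}

def tag_emotions(audio_description: str):
    """Return (token, emotion-category) pairs found in a sentence."""
    tokens = re.findall(r"[a-z']+", audio_description.lower())
    return [(t, EMOTION_LEXICON[t]) for t in tokens if t in EMOTION_LEXICON]

print(tag_emotions("Terrified, Marion backs away; George looks delighted."))
# → [('terrified', 'fear'), ('delighted', 'joy')]
```

Pairing such emotion tags with the characters and events mentioned nearby in the audio description is what would let a retrieval system index scenes by who feels what, and why.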