Content-based video retrieval (CBVR) systems provide new search and browsing capabilities using metadata that describes significant features of the data. An often overlooked aspect of human interpretation of multimedia data is the affective dimension. Incorporating affective information into multimedia metadata can enable search based on this alternative interpretation of multimedia content. Recent work has described methods for automatically assigning affective labels to multimedia data using various approaches. However, the subjective and imprecise nature of affective labels makes it difficult to bridge the semantic gap between system-detected labels and the user's expression of their information need in multimedia retrieval. We present a novel affect-based video retrieval system incorporating an open-vocabulary query stage based on WordNet, enabling search with an unrestricted query vocabulary. The system automatically annotates video data with labels drawn from a set of well-defined affective terms. At retrieval time, annotated documents are ranked against open-vocabulary text queries using the standard Okapi retrieval model. We present experimental results examining the behaviour of the system for retrieval over a collection of automatically annotated feature films of different genres. Our results indicate that affective annotation can provide a useful complement to more traditional objective content description in multimedia retrieval.
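The ranking step described above can be sketched as follows: each video document is represented by its bag of affect labels, and documents are scored against a query with the Okapi BM25 weighting scheme. This is a minimal illustrative sketch, not the authors' implementation; the affect labels, the example documents, and the function name are assumptions, and the WordNet open-vocabulary query-mapping stage is omitted.

```python
import math
from collections import Counter

def bm25_rank(docs, query, k1=1.2, b=0.75):
    """Rank documents (bags of affect-label terms) against a query
    using the Okapi BM25 weighting scheme. Returns document indices,
    best match first."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N  # average document length
    # Document frequency of each term across the collection.
    df = Counter()
    for d in docs:
        for t in set(d):
            df[t] += 1
    scored = []
    for i, d in enumerate(docs):
        tf = Counter(d)
        score = 0.0
        for t in query:
            if t not in tf:
                continue
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1)
            norm = k1 * (1 - b + b * len(d) / avgdl)
            score += idf * tf[t] * (k1 + 1) / (tf[t] + norm)
        scored.append((score, i))
    return [i for _, i in sorted(scored, key=lambda s: -s[0])]

# Hypothetical affect annotations for three film segments.
docs = [
    ["fear", "tension", "fear"],
    ["joy", "excitement"],
    ["sadness", "fear", "tension", "tension"],
]
ranking = bm25_rank(docs, ["fear", "tension"])  # segment 1 ranks last
```

In a full system, the query terms would first be mapped through WordNet similarity onto the fixed vocabulary of affective annotation labels before scoring.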