One of the main challenges in interactive concept-based video search is the scarcity of relevant samples, especially for queries with complex semantics. In this paper, “related samples” are exploited to enhance interactive video search. Related samples refer to video segments that are relevant to part of the query rather than to the entire query. Compared with relevant samples, which may be rare, related samples are usually plentiful and easy to find in the search results. In general, related samples are visually similar and temporally adjacent to relevant samples. Based on these two characteristics, we develop a visual ranking model that simultaneously exploits the relevant, related, and irrelevant samples, as well as a temporal ranking model that leverages the temporal relationship between related and relevant samples. An adaptive fusion method is then proposed to optimally combine these two ranking models and generate the final search results. We conduct extensive experiments on two real-world video datasets, TRECVID 2008 and YouTube. The experimental results show that our approach achieves at least 96% and 167% performance improvement over state-of-the-art approaches on the TRECVID 2008 and YouTube datasets, respectively.
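To make the fusion idea concrete, the following is a minimal sketch of how a visual score (driven by relevant, related, and irrelevant samples) and a temporal score (driven by adjacency to related samples) might be combined with an adaptive weight. All function names, the linear fusion form, and the numeric weights are illustrative assumptions for exposition, not the paper's actual model.

```python
def visual_score(sim_relevant, sim_related, sim_irrelevant,
                 w_rel=1.0, w_related=0.5, w_irr=1.0):
    """Score a video shot by its similarity to relevant and related samples,
    penalized by similarity to irrelevant samples (weights are illustrative)."""
    return w_rel * sim_relevant + w_related * sim_related - w_irr * sim_irrelevant

def temporal_score(shots_to_nearest_related, decay=0.5):
    """Score that decays with temporal distance (in shots) from the nearest
    related sample, reflecting that related samples tend to neighbor relevant ones."""
    return 1.0 / (1.0 + decay * shots_to_nearest_related)

def fused_score(visual, temporal, alpha):
    """Adaptive fusion: alpha in [0, 1] trades off the two ranking models;
    in the paper alpha would be chosen adaptively per query."""
    return alpha * visual + (1.0 - alpha) * temporal
```

With alpha = 1.0 the ranking reduces to the purely visual model; with alpha = 0.0 it ranks shots only by temporal proximity to related samples.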