This paper presents an image retrieval method based on visual and semantic similarity computing with a query-context recognition mechanism. Our work is motivated by the observation that retrieval based on visual similarity alone, or on semantic similarity alone, does not always match users' query intentions. Our central idea is that similarity computing must span both the visual and the semantic levels. To understand the relationship between the visual factors and the semantic factors in images, we performed experimental studies. These studies showed that semantic factors can be extracted from the visual factors of images, and that a user's query intention can be detected from the differences among the images in a query. Based on these results, we developed a method that performs both semantic and visual similarity judgment for image retrieval. In this method, users supply several key images in a query to indicate their query intentions, together with an adjusting value that weights the judgment toward visual or semantic similarity. Both visual and semantic factors are extracted from the key images, and the similarity computation is performed on the extracted factors. The effectiveness of the method is confirmed by our experimental results.
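The adjusting value described above can be read as a weight that blends a visual similarity score with a semantic one. The sketch below is a minimal illustration of that idea, not the paper's actual implementation: the feature vectors, the `alpha` parameter, and the cosine measure are all assumptions made for the example.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def combined_similarity(query_visual, query_semantic,
                        img_visual, img_semantic, alpha):
    """Blend visual and semantic similarity.

    `alpha` plays the role of the paper's adjusting value (hypothetical
    formulation): alpha = 1.0 weights purely visual similarity,
    alpha = 0.0 purely semantic similarity.
    """
    s_vis = cosine(query_visual, img_visual)
    s_sem = cosine(query_semantic, img_semantic)
    return alpha * s_vis + (1.0 - alpha) * s_sem

def rank(query_visual, query_semantic, database, alpha=0.5):
    """Rank database entries (id, visual_vec, semantic_vec) by blended score.

    In the paper's setting the query vectors would be derived from the
    user-supplied key images; here they are passed in directly.
    """
    scored = [(combined_similarity(query_visual, query_semantic, v, s, alpha), i)
              for i, v, s in database]
    return [i for _, i in sorted(scored, key=lambda p: p[0], reverse=True)]
```

Setting `alpha` near 1.0 reproduces a purely visual ranking, while values near 0.0 rank by semantics alone, mirroring how the adjusting value lets the user indicate which kind of similarity their query intends.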