Web image search has been explored in both academia and industry for over a decade. To measure the similarity between Web images and user queries, most existing Web image search systems convert an image into textual keywords by analyzing the textual information available (such as surrounding text and the image filename), with or without leveraging visual features (such as color, texture, and shape). In this way, existing systems transform "Web images" into the "query (text)" space so that the relevance of images to a query can be compared. In this paper, we present a novel solution to Web image search: similarity space projection (SSP). The algorithm treats images and queries as two heterogeneous object peers and projects them into a third, Euclidean "similarity space" in which their similarity can be measured directly. The projection rule guarantees that, in the new space, relevant images stay close to the corresponding query while irrelevant ones are kept far from it. Experiments on real-world Web image collections show that the proposed algorithm significantly outperforms traditional information retrieval models (such as the vector space model) on image search. Beyond Web image search, we demonstrate that the algorithm also applies to image annotation with promising performance, thereby unifying Web image search and image annotation in a single framework.
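The core idea of projecting heterogeneous queries and images into a shared Euclidean space, with relevant pairs pulled close and irrelevant pairs pushed apart, can be sketched in a few lines. Everything below is an illustrative assumption rather than the paper's actual formulation: the linear form of the two projections, the feature dimensions, the hinge margin, and the gradient update rule are all hypothetical choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: queries live in a term space, images in a visual-feature
# space; both are projected into a shared low-dimensional similarity space.
D_QUERY, D_IMAGE, D_JOINT = 50, 128, 16

# Two learnable linear projections (an assumption; the projection need not be linear).
Wq = rng.normal(scale=0.1, size=(D_JOINT, D_QUERY))
Wi = rng.normal(scale=0.1, size=(D_JOINT, D_IMAGE))

def similarity(q, x, Wq, Wi):
    """Similarity = negative Euclidean distance in the joint space."""
    return -np.linalg.norm(Wq @ q - Wi @ x)

def hinge_step(q, x_pos, x_neg, Wq, Wi, margin=1.0, lr=0.002):
    """One gradient step on a triplet margin loss: pull the relevant image
    toward the query, push the irrelevant one at least `margin` farther away."""
    zq, zp, zn = Wq @ q, Wi @ x_pos, Wi @ x_neg
    d_pos = np.linalg.norm(zq - zp)
    d_neg = np.linalg.norm(zq - zn)
    if d_pos - d_neg + margin <= 0:        # constraint already satisfied
        return Wq, Wi
    g_pos = (zq - zp) / (d_pos + 1e-12)    # d(d_pos)/d(zq)
    g_neg = (zq - zn) / (d_neg + 1e-12)    # d(d_neg)/d(zq)
    Wq = Wq - lr * np.outer(g_pos - g_neg, q)
    Wi = Wi - lr * (np.outer(-g_pos, x_pos) + np.outer(g_neg, x_neg))
    return Wq, Wi

# Toy triplet: one query, one relevant image, one irrelevant image.
q = rng.normal(size=D_QUERY)
x_pos = rng.normal(size=D_IMAGE)
x_neg = rng.normal(size=D_IMAGE)

# gap = d_pos - d_neg; training should drive it below zero.
gap_before = -similarity(q, x_pos, Wq, Wi) + similarity(q, x_neg, Wq, Wi)
for _ in range(500):
    Wq, Wi = hinge_step(q, x_pos, x_neg, Wq, Wi)
gap_after = -similarity(q, x_pos, Wq, Wi) + similarity(q, x_neg, Wq, Wi)
```

After training on the triplet, the relevant image scores strictly higher than the irrelevant one for this query, which is exactly the ordering the projection rule is meant to guarantee in the similarity space.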