Image saliency detection provides a powerful tool for predicting where humans tend to look in an image, a long-standing goal of the computer vision community. In this paper, we propose a biologically inspired model for computing image saliency. First, a set of basis functions that accord with visual responses to natural stimuli is learned from eye-fixation patches in an eye-tracking dataset. Three features are then derived from the learned basis functions: continuity, clutter contrast, and local contrast. Finally, these three features are combined into the saliency map. The proposed approach is easy to implement and can be used in many image and video content analysis applications. Experiments on a large-scale benchmark dataset, with comparisons against a number of state-of-the-art approaches, demonstrate its superiority.
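To make the pipeline concrete, the following is a minimal toy sketch of one of the three features named above, local contrast, computed directly on pixel intensities. This is an illustrative assumption only: in the paper the features are derived from the learned basis functions, and the 3x3 neighborhood and absolute-difference contrast measure here are choices made for the sketch, not the authors' formulation.

```python
# Toy local-contrast saliency sketch (illustrative; the paper derives
# this feature from learned basis functions, not raw intensities).

def local_contrast(image):
    """Return a saliency-like map: |pixel - mean of its 3x3 neighborhood|."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Gather the 3x3 neighborhood, clipped at the image border.
            vals = [image[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = abs(image[y][x] - sum(vals) / len(vals))
    return out

if __name__ == "__main__":
    # A dark image with one bright outlier: the outlier should be most salient.
    img = [[0, 0, 0, 0],
           [0, 0, 0, 0],
           [0, 0, 9, 0],
           [0, 0, 0, 0]]
    sal = local_contrast(img)
    peak = max((sal[y][x], (y, x)) for y in range(4) for x in range(4))
    print(peak[1])  # location of the most salient pixel -> (2, 2)
```

In the full model this map would be one of three feature maps (continuity, clutter contrast, local contrast) fused into the final saliency map, e.g. by a weighted combination.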