Multi-spectral saliency detection

  • Authors:
  • Qi Wang; Pingkun Yan; Yuan Yuan; Xuelong Li

  • Affiliations:
  • All authors: Center for OPTical IMagery Analysis and Learning (OPTIMAL), State Key Laboratory of Transient Optics and Photonics, Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, ...

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2013


Abstract

Visual saliency detection has been applied to many tasks in pattern recognition and computer vision, such as image segmentation, object recognition, and image retargeting. However, accurate saliency detection remains a challenge, for two reasons: (1) a well-defined mechanism for defining saliency is rarely established; and (2) the supporting information available for detecting saliency is generally limited. In this paper, a multi-spectral saliency detection algorithm is proposed. Instead of relying only on conventional RGB information, as existing algorithms do, this work incorporates near-infrared cues into the detection framework. Color and texture features from both image modalities are explored simultaneously. When calculating color contrast, an effective color component analysis method is employed to produce more precise results. For texture analysis, a texton representation is adopted for fast processing. Experiments comparing the proposed algorithm with 11 state-of-the-art algorithms indicate that it outperforms the others.
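
The sketch below illustrates the general multi-spectral idea described in the abstract: compute a saliency map from the RGB channels and from a co-registered near-infrared (NIR) channel, then fuse the two. The global-contrast measure, the equal-weight channel averaging, and the `nir_weight` fusion parameter are illustrative assumptions, not the authors' algorithm, which additionally uses color component analysis and texton-based texture features.

```python
import numpy as np


def channel_saliency(channel):
    """Per-channel global-contrast saliency: distance of each pixel
    from the channel's mean intensity, rescaled to [0, 1]."""
    contrast = np.abs(channel - channel.mean())
    rng = contrast.max() - contrast.min()
    return contrast / rng if rng > 0 else np.zeros_like(contrast)


def multispectral_saliency(rgb, nir, nir_weight=0.5):
    """Fuse RGB-based and NIR-based saliency maps.

    rgb: H x W x 3 float array in [0, 1]
    nir: H x W float array in [0, 1], registered to the RGB image
    nir_weight: assumed fusion weight for the NIR cue (hypothetical)
    """
    # Average the per-channel contrast maps of the RGB image.
    rgb_sal = np.mean(
        [channel_saliency(rgb[..., c]) for c in range(3)], axis=0
    )
    nir_sal = channel_saliency(nir)
    # Simple convex combination of the two modalities.
    return (1.0 - nir_weight) * rgb_sal + nir_weight * nir_sal


if __name__ == "__main__":
    # Synthetic example: an object that stands out only in the NIR band.
    rgb = np.full((64, 64, 3), 0.5)
    nir = np.full((64, 64), 0.2)
    nir[20:40, 20:40] = 0.9
    sal = multispectral_saliency(rgb, nir)
    print("saliency inside the object:", round(float(sal[20:40, 20:40].mean()), 3))
    print("saliency in the background:", round(float(sal[:10, :10].mean()), 3))
```

In this toy case the object is invisible to the RGB cue, so only the NIR term separates it from the background, which is the kind of complementary information the abstract argues for.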