Multiscale discriminant saliency for visual attention

  • Authors:
  • Anh Cat Le Ngo; Kenneth Li-Minn Ang; Guoping Qiu; Jasmine Seng Kah-Phooi

  • Affiliations:
  • School of Engineering, The University of Nottingham, Malaysia; Centre for Communications Engineering Research, Edith Cowan University, Australia; School of Computer Science, The University of Nottingham, UK; Department of Computer Science & Networked System, Sunway University, Malaysia

  • Venue:
  • ICCSA'13: Proceedings of the 13th International Conference on Computational Science and Its Applications - Volume 1
  • Year:
  • 2013

Abstract

Bottom-up saliency, an early stage of human visual attention, can be formulated as a binary classification problem between center and surround classes. The discriminant power of a feature for this classification is measured as the mutual information between the feature and the two class distributions. Because the estimated discrepancy between the two feature classes depends strongly on the scale levels considered, multiscale structure and discriminant power are integrated by employing discrete wavelet features and a hidden Markov tree (HMT). From the wavelet coefficients and HMT parameters, quad-tree-like label structures are constructed and used to obtain maximum a posteriori (MAP) estimates of the hidden class variables at the corresponding dyadic sub-squares. A saliency value for each dyadic square at each scale level is then computed from the discriminant power principle and the MAP estimates. Finally, the per-scale maps are integrated into the final saliency map by an information maximization rule. Both standard quantitative measures (NSS, LCC, AUC) and qualitative assessments are used to evaluate the proposed multiscale discriminant saliency method (MDIS) against the well-known information-based saliency method AIM on the Bruce database with eye-tracking data. Simulation results are presented and analyzed to verify the validity of MDIS and to identify its shortcomings as directions for further research.
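
The discriminant measure the abstract refers to is, in essence, the mutual information between a feature response and the center/surround class label. A minimal worked form, with notation chosen here for illustration (the paper's exact formulation may differ), is:

$$
I(X;C) \;=\; \sum_{c \in \{0,1\}} p(c) \int p(x \mid c)\, \log \frac{p(x \mid c)}{p(x)}\, dx,
$$

where $C = 1$ denotes the center class, $C = 0$ the surround class, and $X$ a feature response such as a wavelet coefficient. Features with high $I(X;C)$ separate center from surround well and therefore contribute high saliency.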
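
On the evaluation side, NSS, LCC, and AUC are standard fixation-prediction metrics with well-established definitions. The sketch below, in Python/NumPy (a choice made here; the paper does not specify an implementation), shows how each can be computed from a saliency map and a binary fixation mask; tie handling in the AUC is omitted for brevity.

```python
import numpy as np

def nss(saliency, fixations):
    """Normalized Scanpath Saliency: mean of the z-scored saliency
    map at human fixation locations (boolean mask)."""
    s = (saliency - saliency.mean()) / (saliency.std() + 1e-12)
    return s[fixations].mean()

def lcc(saliency, fixation_density):
    """Linear Correlation Coefficient between the saliency map and
    a (typically Gaussian-smoothed) fixation density map."""
    return np.corrcoef(saliency.ravel(), fixation_density.ravel())[0, 1]

def auc(saliency, fixations):
    """ROC area under the curve: fixated pixels are positives, all
    other pixels negatives, saliency values are the scores."""
    scores = saliency.ravel()
    labels = fixations.ravel().astype(bool)
    order = np.argsort(scores)              # ascending scores
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    # Mann-Whitney U statistic normalized to [0, 1]; ties ignored
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

Here `saliency` is a 2-D float array, `fixations` a boolean array of the same shape marking fixated pixels, and `fixation_density` is usually the fixation mask convolved with a Gaussian before computing the correlation.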