Selective motion analysis based on dynamic visual saliency map model

  • Authors:
  • Inwon Lee; Sang-Woo Ban; Kunihiko Fukushima; Minho Lee

  • Affiliations:
  • School of Electrical Engineering and Computer Science, Kyungpook National University, Taegu, Korea; Dept. of Information and Communication Engineering, Dongguk University, Gyeongbuk, Korea; Graduate School of Informatics, Kansai University, Osaka, Japan; School of Electrical Engineering and Computer Science, Kyungpook National University, Taegu, Korea

  • Venue:
  • ICAISC'06: Proceedings of the 8th International Conference on Artificial Intelligence and Soft Computing
  • Year:
  • 2006


Abstract

We propose a biologically motivated motion analysis model that combines a dynamic bottom-up saliency map model with a neural network for motion analysis whose input is an optical flow. The dynamic bottom-up saliency map model can generate a human-like visual scan path by considering the dynamics of continuous input scenes as well as the saliency of primitive features in a static input scene. The neural network for motion analysis responds selectively to rotation, expansion, contraction, and planar motion of the optical flow within a selected area. Experimental results show that the proposed model produces effective motion analysis by considering only an interesting area instead of the whole input scene, which yields a faster analysis mechanism for dynamic input scenes.
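
The abstract describes a pipeline of the form: compute a saliency map, select a salient area, compute optical flow there, and label the dominant motion type. The sketch below is only an illustrative approximation of that pipeline, not the authors' model: it substitutes a crude intensity/edge conspicuity map for the dynamic bottom-up saliency map and a curl/divergence heuristic for the motion-analysis neural network. Function names (`static_saliency`, `most_salient_window`, `classify_motion`), the 64-pixel window size, and the Farnebäck flow parameters are all assumptions made for the example.

```python
# Minimal sketch of a saliency-gated motion analysis pipeline (NOT the paper's model):
# crude static saliency -> most salient window -> optical flow in that window ->
# heuristic motion label (rotation / expansion / contraction / planar).
import numpy as np
import cv2

def static_saliency(gray):
    """Crude conspicuity map: local intensity contrast plus edge magnitude."""
    blur = cv2.GaussianBlur(gray.astype(np.float32), (21, 21), 0)
    contrast = np.abs(gray.astype(np.float32) - blur)
    edges = cv2.magnitude(cv2.Sobel(gray, cv2.CV_32F, 1, 0),
                          cv2.Sobel(gray, cv2.CV_32F, 0, 1))
    return contrast / (contrast.max() + 1e-6) + edges / (edges.max() + 1e-6)

def most_salient_window(sal, size=64):
    """Pick the window with the highest summed saliency (the 'selected area')."""
    score = cv2.filter2D(sal, -1, np.ones((size, size), np.float32))
    y, x = np.unravel_index(np.argmax(score), score.shape)
    return max(0, y - size // 2), max(0, x - size // 2), size

def classify_motion(flow):
    """Label the flow field by its mean divergence, curl, and translation."""
    u, v = flow[..., 0], flow[..., 1]
    div = float(np.mean(np.gradient(u, axis=1) + np.gradient(v, axis=0)))
    curl = float(np.mean(np.gradient(v, axis=1) - np.gradient(u, axis=0)))
    trans = float(np.hypot(u.mean(), v.mean()))
    scores = {"expansion": max(div, 0.0), "contraction": max(-div, 0.0),
              "rotation": abs(curl), "planar": trans}
    return max(scores, key=scores.get)

def analyze(prev_gray, next_gray):
    """Analyze motion only inside the most salient window of the previous frame."""
    y0, x0, s = most_salient_window(static_saliency(prev_gray))
    flow = cv2.calcOpticalFlowFarneback(prev_gray[y0:y0 + s, x0:x0 + s],
                                        next_gray[y0:y0 + s, x0:x0 + s],
                                        None, 0.5, 3, 15, 3, 5, 1.2, 0)
    return (y0, x0, s), classify_motion(flow)
```

Restricting the flow computation and classification to the selected window, rather than the full frame, mirrors the speed argument made in the abstract; the real model replaces the heuristic classifier with a neural network tuned to rotation, expansion, contraction, and planar motion.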