An improvement in MSEPF for visual tracking

  • Authors: Yuki Nakagama; Masahiro Yokomichi
  • Affiliations: Graduate School of Computer Science and Systems Engineering, University of Miyazaki, Miyazaki, Japan; Department of Computer Science and Systems Engineering, University of Miyazaki, Miyazaki, Japan 889-2155
  • Venue: Artificial Life and Robotics
  • Year: 2010


Abstract

Recently, many approaches to applying a particle filter to visual tracking have been proposed. However, it is hard to implement such a filter in a real-time system because achieving high accuracy requires a great deal of computation time and considerable resources. To overcome this difficulty, especially the computation time, Shan and co-workers have proposed combining a particle filter with mean shift to maintain accuracy with a small number of particles. In their approach, the state of each particle moves to the point with the highest likelihood value within a window. The accuracy of the estimate is known to depend on the window size, but a larger window makes the computation slower. In this article, we propose a method for finding the highest-likelihood point more quickly by means of random sampling. Moreover, the likelihood is modified to use not only color cues but also motion cues for greater accuracy in object tracking. The effectiveness of the proposed method is evaluated through experiments on real image sequences.
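The idea described above, shifting each particle to the best of a few randomly sampled points in its window rather than scanning the whole window, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the likelihood function, the window size, the sample count, and all function names here are hypothetical, and the combined color/motion likelihood is replaced by a toy surface peaked at a known target position.

```python
import random

def likelihood(x, y):
    # Hypothetical stand-in for the paper's combined likelihood:
    # two toy cues (standing in for color and motion), both peaked
    # at an assumed true target position (50, 50), averaged together.
    color = 1.0 / (1.0 + (x - 50) ** 2 + (y - 50) ** 2)
    motion = 1.0 / (1.0 + abs(x - 50) + abs(y - 50))
    return 0.5 * color + 0.5 * motion

def shift_particle(px, py, window=5, n_samples=10, rng=random):
    """Move a particle to the best of n_samples random points drawn
    inside its window, instead of evaluating every point in it."""
    best, best_l = (px, py), likelihood(px, py)
    for _ in range(n_samples):
        cx = px + rng.uniform(-window, window)
        cy = py + rng.uniform(-window, window)
        l = likelihood(cx, cy)
        if l > best_l:
            best, best_l = (cx, cy), l
    return best

def track_step(particles, rng=random):
    # One filter iteration: shift each particle toward a local
    # likelihood peak, then importance-weight and resample
    # (the standard particle-filter update).
    shifted = [shift_particle(px, py, rng=rng) for px, py in particles]
    weights = [likelihood(px, py) for px, py in shifted]
    return rng.choices(shifted, weights=weights, k=len(shifted))
```

With random sampling, the per-particle cost is fixed by `n_samples` rather than growing with the window area, which reflects the speed motivation stated in the abstract; a larger window can then be used without the quadratic cost of an exhaustive scan.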