Multimodal MSEPF for visual tracking

  • Authors:
  • Masahiro Yokomichi; Yuki Nakagama

  • Affiliations:
  • Faculty of Engineering, University of Miyazaki, Miyazaki, Japan 889-2155; The Graduate School of Computer Science and Systems Engineering, University of Miyazaki, Miyazaki, Japan 889-2155

  • Venue:
  • Artificial Life and Robotics
  • Year:
  • 2012

Abstract

Recently, the particle filter has been applied to many visual tracking problems, and it has been modified to reduce computation time and memory usage. One such modification is the Mean-Shift embedded particle filter (MSEPF, for short), which was further extended as the Randomized MSEPF. These methods can decrease the number of particles without loss of tracking accuracy. However, the accuracy may depend on the definitions of the likelihood function (observation model) and of the prediction model. In this paper, the authors propose an extension of these models in order to increase tracking accuracy. Furthermore, the expansion resetting method, which was originally proposed for mobile robot localization, and adaptive resizing of the Mean-Shift search window are applied selectively in order to handle occlusion and rapid changes in movement.
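The core idea of MSEPF described in the abstract — propagating particles with a prediction model, then shifting each particle toward a local mode of the likelihood before weighting and resampling — can be illustrated with a minimal one-dimensional sketch. This is not the authors' implementation; the Gaussian observation model, the random-walk prediction model, and all parameter values (window size, noise level, iteration counts) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(x, target):
    # Hypothetical Gaussian observation model (assumption, not the paper's).
    return np.exp(-0.5 * ((x - target) / 2.0) ** 2)

def mean_shift(x, target, window=5.0, iters=5):
    # Shift a particle toward a nearby mode of the likelihood by repeatedly
    # taking a likelihood-weighted mean over a local search window.
    for _ in range(iters):
        samples = x + np.linspace(-window, window, 11)
        w = likelihood(samples, target)
        if w.sum() == 0.0:
            break
        x = float(np.sum(w * samples) / w.sum())
    return x

def msepf_step(particles, target, noise=1.0):
    # Predict: random-walk motion model (illustrative assumption).
    particles = particles + rng.normal(0.0, noise, particles.shape)
    # Mean-Shift refinement: move each particle toward a likelihood mode,
    # which is what lets MSEPF work with few particles.
    particles = np.array([mean_shift(p, target) for p in particles])
    # Weight by the observation model and resample.
    w = likelihood(particles, target)
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Track a stationary target at x = 10 from a diffuse initial particle set.
target = 10.0
particles = rng.uniform(-20.0, 20.0, 30)
for _ in range(5):
    particles = msepf_step(particles, target)
estimate = particles.mean()
```

Because the mean-shift step pulls every particle toward the likelihood mode before weighting, the filter concentrates around the target with only 30 particles; a plain particle filter would typically need more particles to achieve comparable accuracy.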