Poster abstract: human tracking based on LRF and wearable IMU data fusion

  • Authors:
  • Lin Wu; ZhuLin An; YongJun Xu; Li Cui

  • Affiliations:
  • Institute of Computing Technology, Chinese Academy of Sciences & University of Chinese Academy of Sciences, Beijing, China (Lin Wu); Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China (ZhuLin An; YongJun Xu; Li Cui)

  • Venue:
  • Proceedings of the 12th international conference on Information processing in sensor networks
  • Year:
  • 2013

Abstract

Human tracking is one of the most important requirements for mobile service robots. Cameras and Laser Range Finders (LRFs) are usually used together for human tracking, but such solutions are too computationally expensive for most embedded processors on these robots, as complex computer vision algorithms are needed to process a large number of pixels. In this paper, we describe a method combining kinematic measurements from an LRF mounted on the robot and an Inertial Measurement Unit (IMU) carried by the target. These two types of sensors can estimate the human's velocity and position independently, and this information is used both for identifying and for tracking the target. As the LRF and IMU data are 1D rather than 2D images, our method requires far less computation and memory and can be implemented on low-performance embedded processors.
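The abstract's core idea — identifying the target by comparing a velocity estimate derived from the robot's LRF with one derived from the target's worn IMU — can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the function names, the simple finite-difference and forward-Euler integration schemes, and the agreement tolerance are all assumptions for illustration.

```python
import math

def lrf_velocity(positions, dt):
    """Estimate speed from successive (x, y) position fixes of an
    LRF-detected candidate, using finite differences (assumed scheme)."""
    return [math.hypot(x2 - x1, y2 - y1) / dt
            for (x1, y1), (x2, y2) in zip(positions, positions[1:])]

def imu_velocity(accels, dt, v0=0.0):
    """Estimate speed by integrating the IMU's forward acceleration
    (forward Euler; assumed scheme)."""
    vels, v = [], v0
    for a in accels:
        v += a * dt
        vels.append(v)
    return vels

def profiles_match(v_lrf, v_imu, tol=0.2):
    """The tracked person is the LRF candidate whose velocity profile
    agrees with the IMU-derived one within a tolerance (value assumed)."""
    return all(abs(a - b) <= tol for a, b in zip(v_lrf, v_imu))
```

For example, among several LRF-detected candidates, only the one whose position history yields a velocity profile matching the IMU integration would be identified as the person wearing the sensor; all of the data involved is 1D, which is the source of the computational savings the abstract claims.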