Detecting pedestrian flocks by fusion of multi-modal sensors in mobile phones

  • Authors:
  • Mikkel Baun Kjærgaard, Martin Wirz, Daniel Roggen, Gerhard Tröster

  • Affiliations:
  • Wearable Computing Laboratory, ETH Zurich, Switzerland (all authors)

  • Venue:
  • Proceedings of the 2012 ACM Conference on Ubiquitous Computing
  • Year:
  • 2012


Abstract

Previous work on the recognition of human movement patterns has mainly focused on the movements of individuals. This paper addresses the joint identification of the indoor movement of multiple persons forming a cohesive whole, specifically a flock, using clustering approaches that operate on features derived from multiple sensor modalities of modern smartphones. Automatic detection of flocks has several important applications, including evacuation management and socially aware computing. The novelty of this paper is, firstly, to use data fusion techniques to combine several sensor modalities (WiFi, accelerometer, and compass) to improve recognition accuracy over previous unimodal approaches, and secondly, to improve the recognition of flocks using hierarchical clustering. We use a dataset comprising 16 subjects forming one to four flocks while walking in a building on single and multiple floors. With the best settings, we achieve an F-score of up to 87 percent, an improvement of up to twelve percentage points over existing approaches.
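The general approach described in the abstract, grouping people into flocks by clustering per-person feature vectors fused from several phone sensors, can be sketched with a minimal agglomerative (single-linkage) clustering routine. This is an illustrative sketch, not the paper's implementation: the feature layout, distance metric, and threshold below are all assumptions.

```python
# Hypothetical sketch of flock detection via hierarchical clustering.
# Each subject is summarized by a fused feature vector; subjects whose
# vectors are close (similar heading, gait, and WiFi environment) are
# merged into the same cluster, i.e. the same flock.

def euclidean(a, b):
    # Plain Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def flock_clusters(features, threshold):
    """Agglomerative single-linkage clustering: repeatedly merge the two
    closest clusters until the smallest inter-cluster distance exceeds
    `threshold` (an assumed cut-off, analogous to cutting a dendrogram)."""
    clusters = [[i] for i in range(len(features))]
    while len(clusters) > 1:
        best = None  # (distance, i, j) of the closest cluster pair
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclidean(features[a], features[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        if d > threshold:
            break  # no remaining pair is similar enough to merge
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Toy fused features per subject (all values and scalings invented):
# (mean compass heading / 360, step frequency in Hz, mean WiFi RSSI / -100).
subjects = [
    (0.10, 1.8, 0.55),  # subjects 0-2: similar heading, gait, and
    (0.11, 1.9, 0.54),  # WiFi signature -> one flock
    (0.12, 1.8, 0.56),
    (0.70, 1.2, 0.20),  # subjects 3-4: a second flock
    (0.71, 1.3, 0.21),
]
groups = flock_clusters(subjects, threshold=0.2)
print(sorted(sorted(g) for g in groups))  # → [[0, 1, 2], [3, 4]]
```

Single linkage is used here because a flock is a chain-like notion of cohesion (each member stays close to at least one other member); the paper's actual clustering variant and features may differ.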