Automatic event-based synchronization of multimodal data streams from wearable and ambient sensors

  • Authors:
  • David Bannach;Oliver Amft;Paul Lukowicz

  • Affiliations:
  • Embedded Systems Lab., University of Passau;Signal Processing Systems, TU Eindhoven;Embedded Systems Lab., University of Passau

  • Venue:
  • EuroSSC'09 Proceedings of the 4th European conference on Smart sensing and context
  • Year:
  • 2009

Abstract

A major challenge in using multi-modal, distributed sensor systems for activity recognition is to maintain temporal synchronization between individually recorded data streams. A common approach is to use well-defined 'synchronization actions' performed by the user to generate easily identifiable pattern events in all recorded data streams. These events are then used to manually align the data streams. This paper proposes an automatic method for this synchronization. We demonstrate that synchronization actions can be automatically identified and used to synchronize streams across widely differing sensors, such as acceleration, sound, and force sensors and a motion tracking system. We describe fundamental properties and bounds of our event-based synchronization approach. In particular, we show that the event timing relation is transitive for sensor groups with shared members. We evaluated our synchronization approach in three studies. For a large dataset of 5 users and a total of 308 minutes of data streams, we achieved a synchronization error of 0.3 s for more than 80% of the stream.
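
The following is a minimal sketch (not the authors' implementation) of the core idea described in the abstract: once the same synchronization actions have been detected in two streams, their matched event timestamps yield an offset estimate, and offsets compose transitively across sensor groups that share a member. All timestamps, sensor names, and the median-based estimator below are illustrative assumptions.

```python
from statistics import median

def estimate_offset(events_a, events_b):
    """Estimate the clock offset of stream B relative to stream A from
    pairs of timestamps (in seconds) of the same synchronization events."""
    # The median keeps the estimate robust against a single mismatched event.
    return median(tb - ta for ta, tb in zip(events_a, events_b))

# Hypothetical event timestamps of three streams observing the same actions.
acc_events   = [12.0, 45.1, 90.3]   # wrist accelerometer
sound_events = [13.2, 46.3, 91.5]   # ambient microphone, ~1.2 s late
force_events = [11.5, 44.6, 89.8]   # force sensor, ~0.5 s early

off_acc_sound   = estimate_offset(acc_events, sound_events)    # ~ +1.2 s
off_sound_force = estimate_offset(sound_events, force_events)  # ~ -1.7 s

# Transitivity: the acc -> force offset follows from the two shared-member
# estimates without matching acceleration and force events directly.
off_acc_force = off_acc_sound + off_sound_force                # ~ -0.5 s
print(off_acc_sound, off_sound_force, off_acc_force)
```

In practice the estimated offsets would be applied by shifting each stream's timestamps toward a common reference clock before alignment.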