High level data fusion on a multimodal interactive application platform

  • Authors:
  • Hildeberto Mendonça

  • Affiliations:
  • Université catholique de Louvain, Louvain-la-Neuve, Belgium

  • Venue:
  • Proceedings of the 1st ACM SIGCHI symposium on Engineering interactive computing systems
  • Year:
  • 2009


Abstract

This research proposes a multimodal fusion framework for high-level data integration across two or more modalities. It takes as input low-level features extracted from different system devices and identifies the intrinsic meanings in these data through dedicated processes running in parallel. The extracted meanings are then compared with one another to identify complementarities, ambiguities, and inconsistencies, so that the user's intention when interacting with the system can be better understood. The whole fusion lifecycle is described and evaluated in an ambient-intelligence scenario in which two co-workers interact by voice and movement, demonstrating their intentions, and the system gives advice according to the identified needs.
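The comparison step the abstract describes could be sketched as follows. This is a minimal illustration, not the authors' implementation: the `Meaning` structure, the relation labels, and the confidence threshold are all hypothetical, chosen only to show how meanings extracted in parallel from two modalities might be mutually compared.

```python
from dataclasses import dataclass


@dataclass
class Meaning:
    """A hypothetical high-level interpretation extracted from one modality."""
    modality: str       # e.g. "voice" or "movement"
    intent: str         # identified user intention, e.g. "open_document"
    confidence: float   # interpreter's confidence, in [0, 1]


def fuse(a: Meaning, b: Meaning) -> str:
    """Classify the relation between two modalities' extracted meanings
    as complementary, ambiguous, or inconsistent (labels are illustrative)."""
    if a.intent == b.intent:
        # Both modalities point to the same intention: they reinforce each other.
        return "complementary"
    if abs(a.confidence - b.confidence) < 0.1:
        # Conflicting intents with similar confidence: the system cannot decide.
        return "ambiguous"
    # Conflicting intents, but one interpretation clearly dominates.
    return "inconsistent"


# Example: voice and movement agree on the same intention.
voice = Meaning("voice", "open_document", 0.9)
gesture = Meaning("movement", "open_document", 0.7)
print(fuse(voice, gesture))  # complementary
```

In the paper's scenario, a result of `complementary` would let the system act on the shared intention, while `ambiguous` or `inconsistent` outcomes would require further disambiguation before giving advice.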