A data-driven paradigm to understand multimodal communication in human-human and human-robot interaction

  • Authors:
  • Chen Yu; Thomas G. Smith; Shohei Hidaka; Matthias Scheutz; Linda B. Smith

  • Affiliations:
  • Psychological and Brain Sciences and Cognitive Science Program, Indiana University, Bloomington, IN (all authors)

  • Venue:
  • IDA'10: Proceedings of the 9th International Conference on Advances in Intelligent Data Analysis
  • Year:
  • 2010


Abstract

Data-driven knowledge discovery is becoming a new trend across scientific fields. In light of this trend, the goal of the present paper is to introduce a novel framework for studying a central topic in cognitive and behavioral research: multimodal communication in human-human and human-robot interaction. We present an end-to-end solution spanning data capture, data coding and validation, and data analysis and visualization. For data collection, we developed a multimodal sensing system that gathers fine-grained video, audio, and human body movement data. For data analysis, we propose a hybrid approach that combines visual data mining with information-theoretic measures. We suggest that this data-driven paradigm will not only lead to breakthroughs in understanding multimodal communication, but will also serve as a case study demonstrating the promise of data-intensive discovery for a wide range of research topics in cognitive and behavioral studies.
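
To make the information-theoretic side of such an analysis concrete, the sketch below (a minimal illustration, not the authors' implementation) computes the mutual information between two frame-aligned, discretely coded behavioral streams, such as a speaker's and a listener's gaze targets. The variable names and example data are hypothetical.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Mutual information (in bits) between two aligned discrete sequences."""
    assert len(x) == len(y), "streams must be frame-aligned"
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), count in pxy.items():
        p_ab = count / n                    # joint probability p(a, b)
        p_a, p_b = px[a] / n, py[b] / n     # marginal probabilities
        mi += p_ab * math.log2(p_ab / (p_a * p_b))
    return mi

# Hypothetical frame-by-frame gaze codes for two interacting partners.
speaker_gaze  = ["obj1", "obj1", "face", "obj2", "obj2", "face"]
listener_gaze = ["obj1", "face", "face", "obj2", "face", "face"]
print(f"MI = {mutual_information(speaker_gaze, listener_gaze):.3f} bits")
```

High mutual information between two streams would indicate tightly coupled behavior between interaction partners; in practice, the measure is sensitive to how the continuous sensor data are discretized and to the amount of data available.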