Distributed multisensory signals acquisition and analysis in dyadic interactions

  • Authors:
  • Ashish Tawari; Cuong Tran; Anup Doshi; Thorsten Zander; Mohan Trivedi

  • Affiliations:
  • University of California, San Diego, La Jolla, California, USA; University of California, San Diego, La Jolla, California, USA; University of California, San Diego, La Jolla, California, USA; Max Planck Institute for Intelligent Systems, Tuebingen, Germany; University of California, San Diego, La Jolla, California, USA

  • Venue:
  • CHI '12 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2012

Abstract

Human-machine interaction could be enhanced by providing information about the user's state, allowing for automated adaptation of the system. Such a context-aware system, however, must be able to deal with spontaneous and subtle user behavior. The artificial intelligence behind such systems therefore also needs spontaneous behavior data for training as well as evaluation. Although harder to collect and annotate, spontaneous behavior data are preferable to posed data because they are representative of real-world behavior. Toward this end, we have designed a distributed testbed for multisensory signal acquisition that facilitates spontaneous interactions. We recorded audio-visual as well as physiological signals from 6 pairs of subjects while they played a bluffing dice game against each other. In this paper, we introduce the collected database and provide our preliminary results on bluff detection based on spatio-temporal face image signal analysis.
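
The abstract mentions bluff detection from spatio-temporal analysis of face image signals but does not spell out a pipeline. The sketch below is an illustrative assumption only, not the authors' method: it crops the face with an OpenCV Haar cascade, summarizes a short clip by frame-to-frame changes in the face region, and feeds a fixed-length feature vector to a scikit-learn SVM. The window length, feature choice, and classifier are assumptions made for this example.

# Illustrative sketch only: per-clip bluff/no-bluff classification from
# simple spatio-temporal face-signal features. Detector, features, window
# length, and classifier are assumptions, not the method in the paper.
import cv2
import numpy as np
from sklearn.svm import SVC

FACE_SIZE = (64, 64)  # assumed size for the normalized face patch
WINDOW = 16           # assumed number of frames per spatio-temporal window

_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_patch(frame):
    """Detect the largest face in a BGR frame; return a normalized gray patch."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return cv2.resize(gray[y:y + h, x:x + w], FACE_SIZE).astype(np.float32)

def clip_features(video_path):
    """Mean absolute frame-to-frame face change over a short clip window."""
    cap = cv2.VideoCapture(video_path)
    diffs, prev = [], None
    while len(diffs) < WINDOW:
        ok, frame = cap.read()
        if not ok:
            break
        patch = face_patch(frame)
        if patch is None:
            continue
        if prev is not None:
            diffs.append(float(np.abs(patch - prev).mean()))
        prev = patch
    cap.release()
    # Zero-pad so every clip yields a fixed-length feature vector.
    vec = np.zeros(WINDOW, dtype=np.float32)
    vec[:len(diffs)] = diffs
    return vec

# Usage sketch: clip paths and bluff/no-bluff labels would come from the
# annotated game recordings (file names here are hypothetical).
# X = np.stack([clip_features(p) for p in ["round_01.avi", "round_02.avi"]])
# clf = SVC(kernel="rbf").fit(X, [1, 0])
# print(clf.predict(clip_features("round_03.avi").reshape(1, -1)))

A classifier of this kind would only be a baseline; the recorded physiological and audio channels could be fused with the face-based features, but that choice is outside what the abstract states.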