Integrating semantics into multimodal interaction patterns

  • Authors:
  • Ronnie Taib; Natalie Ruiz

  • Affiliations:
  • Both authors: ATP Research Laboratory, National ICT Australia, Sydney, NSW, Australia and School of Computer Science and Engineering, The University of New South Wales, Sydney, NSW, Australia

  • Venue:
  • MLMI'07 Proceedings of the 4th international conference on Machine learning for multimodal interaction
  • Year:
  • 2007


Abstract

We report a user experiment on multimodal interaction (speech, hand position, and hand shape) that studied two major relationships: between the level of cognitive load experienced by users and the resulting multimodal interaction patterns; and between the semantics of the information being conveyed and those patterns. We found that as cognitive load increases, users' multimodal productions tend to become semantically more complementary and less redundant across modalities. This validates cognitive load theory as a theoretical framework for understanding the occurrence of particular kinds of multimodal productions. Moreover, the results indicate a significant relationship between the temporal multimodal integration pattern (7 patterns in this experiment) and the semantics of the command being issued by the user (4 types of commands), shedding new light on previous research findings that assign a single temporal integration pattern to any given subject regardless of the communication taking place.