The roles of sensory modalities in collaborative virtual environments (CVEs)

  • Authors: Chang S. Nam; Joseph Shu; Donghun Chung

  • Affiliations: Department of Industrial Engineering, University of Arkansas, Fayetteville, AR 72701, USA; Handshake VR Inc., Waterloo, ON N2L 5C6, Canada; School of Communication, Kwangwoon University, Seoul, 139-701, Korea

  • Venue: Computers in Human Behavior
  • Year: 2008


Abstract

This study assessed the effects of sensory modalities on user performance, perception, and behavior in collaborative virtual environments (CVEs). Participants played a CVE air hockey game with a remote partner under three sensory modality conditions, defined by the type of sensory feedback provided: visual-only (V), visual-haptic (V+H), and visual-haptic-audio (V+H+A). Three types of measures served as dependent variables: (1) task performance, measured as playing time; (2) user perception, including the sense of presence, the sense of togetherness, and perceived collaboration; and (3) user behavior, including the amount of force applied and the mallet deviation. The results indicated that task performance, perception, and user behavior in CVEs are affected by the sensory modalities supported. Therefore, to effectively support collaboration between people in CVEs, the multiple types of sensory information required to perform the task at hand should be provided. The outcomes of this research should have a broad impact on multimodal user interaction, including research on the physiological, psychophysical, and psychological mechanisms underlying human perception of multisensory feedback in CVEs.