An Implementation of Multi-Modal Game Interface Based on PDAs

  • Authors:
  • Kue-Bum Lee; Jung-Hyun Kim; Kwang-Seok Hong

  • Affiliations:
  • Sungkyunkwan University, Korea; Sungkyunkwan University, Korea; Sungkyunkwan University, Korea

  • Venue:
  • SERA '07 Proceedings of the 5th ACIS International Conference on Software Engineering Research, Management & Applications
  • Year:
  • 2007

Abstract

In computer animation and interactive computer games, gesture and speech modalities can provide a powerful interface between humans and computers. In this paper, we propose a Personal Digital Assistant (PDA)-based multi-modal network game interface using speech, gesture, and touch input. To verify the validity of our approach, we implement a multi-modal omok game over TCP/IP on a PDA network. Experiments with the proposed multi-modal network game yielded an average recognition rate of 97.4%. Because the weaknesses of any single modality, such as commands mis-processed due to recognition errors, are offset by the strengths of the other modalities, the user can enjoy a more interactive mobile game interface in any given environment.
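
The abstract does not describe the game protocol, so the following is only a minimal, hypothetical sketch of how a move recognized by any one modality (speech, gesture, or touch) might be encoded as a single game command and sent over TCP/IP to the opposing PDA. The message format, enum, address, and port below are illustrative assumptions, not the paper's actual implementation.

```java
import java.io.*;
import java.net.*;

// Hypothetical sketch: sending an omok (five-in-a-row) move over TCP/IP.
// The message format, host, and port are assumptions for illustration only.
enum Modality { SPEECH, GESTURE, TOUCH }

public class OmokMoveSender {
    // Encode a board move as a simple text line, e.g. "MOVE 7 11 TOUCH".
    static String encodeMove(int row, int col, Modality source) {
        return String.format("MOVE %d %d %s", row, col, source);
    }

    public static void main(String[] args) throws IOException {
        // Connect to the opponent's PDA (address/port are placeholders).
        try (Socket socket = new Socket("192.168.0.2", 5000);
             PrintWriter out = new PrintWriter(
                     new OutputStreamWriter(socket.getOutputStream()), true)) {
            // A move recognized by any modality is sent as the same game
            // command, so a recognition failure in one input channel can be
            // compensated by issuing the command through another.
            out.println(encodeMove(7, 11, Modality.TOUCH));
        }
    }
}
```

Mapping every modality onto one shared command representation is one plausible way to realize the complementarity the abstract describes: the network layer stays unimodal while the input layer remains multi-modal.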