Multimodal smart interactive presentation system

  • Authors:
  • Hoang-An Le; Khoi-Nguyen C. Mac; Truong-An Pham; Vinh-Tiep Nguyen; Minh-Triet Tran

  • Affiliations:
  • Faculty of Information Technology, University of Science, VNU-HCMC, Vietnam; John von Neumann Institute, VNU-HCMC, Vietnam

  • Venue:
  • HCI'13 Proceedings of the 15th international conference on Human-Computer Interaction: interaction modalities and techniques - Volume Part IV
  • Year:
  • 2013


Abstract

We propose a system that allows presenters to control presentations naturally with body gestures and vocal commands. A presentation thus no longer has to follow a rigid sequential structure but can be delivered in flexible, content-adapted scenarios. The proposed system fuses three interaction modules: gesture recognition from Kinect 3D skeletal data, key-concept detection through context analysis of natural speech, and small-scale hand-gesture recognition from smartphone sensor (haptic) data. The modules run in real time with accuracies of 95.0%, 91.2%, and 90.1%, respectively. The system uses events generated by the three modules to trigger pre-defined scenarios in a presentation, making the experience more engaging for audiences.
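The fusion described in the abstract can be pictured as an event dispatcher: each recognition module emits named events, and the system maps them to pre-defined presentation scenarios. The sketch below is illustrative only; the module, event, and scenario names are assumptions, not the paper's actual API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple


@dataclass(frozen=True)
class Event:
    """An event emitted by one of the three recognition modules.

    Module and event names here (e.g. "kinect", "swipe_left") are
    hypothetical placeholders for illustration.
    """
    module: str  # e.g. "kinect", "speech", "phone"
    name: str    # e.g. "swipe_left", "keyword:budget", "tilt_right"


class PresentationController:
    """Maps (module, event) pairs to pre-defined scenario actions."""

    def __init__(self) -> None:
        self._scenarios: Dict[Tuple[str, str], Callable[[], str]] = {}

    def register(self, module: str, name: str,
                 action: Callable[[], str]) -> None:
        # Bind a scenario action to an event from a specific module.
        self._scenarios[(module, name)] = action

    def dispatch(self, event: Event) -> str:
        # Trigger the registered scenario, or ignore unknown events.
        action = self._scenarios.get((event.module, event.name))
        return action() if action else "ignored"


controller = PresentationController()
controller.register("kinect", "swipe_left", lambda: "next_slide")
controller.register("speech", "keyword:budget", lambda: "jump_to_budget")
controller.register("phone", "tilt_right", lambda: "start_animation")

print(controller.dispatch(Event("kinect", "swipe_left")))   # next_slide
print(controller.dispatch(Event("kinect", "clap_hands")))   # ignored
```

Keeping the dispatch table explicit like this is one simple way to let a presentation designer attach scenarios to any combination of the three modalities without changing the recognition code.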