Motion-based perceptual user interface

  • Authors:
  • Xiubo Liang, Shun Zhang, Xiang Zhang, Weidong Geng

  • Affiliations:
  • State Key Lab. of CAD&CG, Zhejiang University, Hangzhou, China (all authors)

  • Venue:
  • IITA '09: Proceedings of the 3rd International Conference on Intelligent Information Technology Application
  • Year:
  • 2009


Abstract

Perceptual user interfaces take advantage of human perceptual capabilities to present semantic information in natural and intuitive ways. In this paper, we present a novel approach that provides users with an accelerometer-based interface for interactively controlling not only functions and devices in digital environments but also virtual characters in game-like scenarios. The approach is suitable for both PC and mobile platforms; its core techniques include the automatic generation and preprocessing of training samples and the proper setup of machine learning models. Three sample applications are given: a gesture-controlled slide presentation system using the Wii Remote, a gesture recognition system for placing phone calls on the Nokia N95, and a performance-driven motion choreographing system using the Xsens MTx. Experimental results show a recognition rate of over 95%, which is quite acceptable for gesture interaction systems.
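
The abstract only outlines the pipeline (preprocess accelerometer training samples, then fit a machine learning model), so the following is a minimal sketch of that general idea, not the authors' implementation. The fixed-length resampling, the SVM classifier, and the gesture names "circle" and "shake" are illustrative assumptions.

```python
# Hypothetical sketch: accelerometer gesture recognition via preprocessing
# plus a generic classifier. Not the paper's actual model or feature set.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def preprocess(trace, target_len=32):
    """Resample a variable-length (N, 3) accelerometer trace onto a fixed
    time grid and flatten it into one feature vector."""
    trace = np.asarray(trace, dtype=float)
    src = np.linspace(0.0, 1.0, num=len(trace))
    dst = np.linspace(0.0, 1.0, num=target_len)
    resampled = np.stack(
        [np.interp(dst, src, trace[:, axis]) for axis in range(3)], axis=1
    )
    return resampled.ravel()


# Toy training data standing in for automatically generated samples:
# each entry is a raw accelerometer trace with a gesture label.
rng = np.random.default_rng(0)
traces = [rng.normal(size=(rng.integers(20, 60), 3)) for _ in range(40)]
labels = ["circle" if i % 2 == 0 else "shake" for i in range(40)]

# Preprocess all samples, then set up and train the learning model.
X = np.array([preprocess(t) for t in traces])
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, labels)

# Recognize a newly captured gesture trace.
print(model.predict([preprocess(rng.normal(size=(35, 3)))]))
```

In a real deployment the traces would come from a Wii Remote, Nokia N95, or Xsens MTx sensor rather than random noise, and the predicted label would be mapped to a slide command, a phone call, or a character motion.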