Real-time written-character recognition using MEMS motion sensors: Calibration and experimental results

  • Authors:
  • Zhuxin Dong; Uchechukwu C. Wejinya; Shengli Zhou; Qing Shan; Wen J. Li

  • Affiliations:
  • Department of Mechanical Engineering, University of Arkansas, Fayetteville, Arkansas 72701, USA; Department of Mechanical Engineering, University of Arkansas, Fayetteville, Arkansas 72701, USA; Centre for Micro and Nano Systems, Department of Mechanical & Automation Engineering, The Chinese University of Hong Kong, HKSAR, China; Centre for Micro and Nano Systems, Department of Mechanical & Automation Engineering, The Chinese University of Hong Kong, HKSAR, China; Centre for Micro and Nano Systems, Department of Mechanical & Automation Engineering, The Chinese University of Hong Kong, HKSAR, China

  • Venue:
  • ROBIO '09 Proceedings of the 2008 IEEE International Conference on Robotics and Biomimetics
  • Year:
  • 2009

Abstract

A Micro Inertial Measurement Unit (μIMU) based on Micro Electro Mechanical Systems (MEMS) sensors is applied to sense the motion information produced by human subjects. The μIMU is built around a three-dimensional accelerometer. Although the characters in our experiments are written in a plane, all three axes of acceleration from the μIMU are used in the data processing, since the third dimension is valuable in practical applications. In our previous work, the effectiveness of different data processing methods, including the Fast Fourier Transform (FFT) and the Discrete Cosine Transform (DCT), was compared; the DCT performed better and is therefore adopted in this paper. In addition, Hidden Markov Models (HMMs) are introduced as the tool for hand-gesture classification. With this method, new experimental results for handwritten-character recognition are obtained and reported in this paper. Five Arabic numerals, 0–4, are each written forty times by two different persons, and all of these data are used as training samples. When another 29 samples are then input for recognition, a correct rate of 93% is obtained. Ultimately, this technology demonstrates the feasibility of character recognition and its potential for human gesture recognition.
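The abstract outlines a pipeline of three-axis accelerometer data, DCT-based features, and HMM classification. The sketch below is only an illustration of that kind of pipeline, not the authors' implementation: the framing parameters, the number of retained DCT coefficients, the number of HMM states, and the use of scipy and hmmlearn are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's code): per-digit HMM classification of
# 3-axis accelerometer strokes using DCT features. Frame sizes, coefficient
# counts and model sizes are placeholder values, not taken from the paper.
import numpy as np
from scipy.fft import dct
from hmmlearn.hmm import GaussianHMM

N_DCT = 16  # low-frequency DCT coefficients kept per axis (assumed)

def dct_features(accel):
    """accel: (T, 3) array of x/y/z acceleration samples for one character.
    Returns a (n_frames, 3 * N_DCT) feature sequence built from short frames."""
    frame_len, hop = 32, 16                                  # illustrative framing
    frames = []
    for start in range(0, len(accel) - frame_len + 1, hop):
        window = accel[start:start + frame_len]              # (frame_len, 3)
        coeffs = dct(window, axis=0, norm="ortho")[:N_DCT]   # keep low frequencies
        frames.append(coeffs.T.ravel())                      # concatenate the 3 axes
    return np.asarray(frames)

def train_digit_models(samples_by_digit, n_states=5):
    """samples_by_digit: {digit: [accel arrays]}.
    Trains one Gaussian HMM per digit on that digit's training samples."""
    models = {}
    for digit, samples in samples_by_digit.items():
        feats = [dct_features(s) for s in samples]
        X = np.vstack(feats)
        lengths = [len(f) for f in feats]
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[digit] = m
    return models

def classify(models, accel):
    """Return the digit whose HMM assigns the new sample the highest log-likelihood."""
    feats = dct_features(accel)
    return max(models, key=lambda d: models[d].score(feats))
```

In this whole-model style of classification, recognizing one of the digits 0–4 amounts to scoring the feature sequence under each trained HMM and picking the best-scoring model; the 40 handwritten repetitions per digit described in the abstract would serve as the training samples.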