iRotateGrasp: automatic screen rotation based on grasp of mobile devices

  • Authors:
  • Lung-Pan Cheng; Meng Han Lee; Che-Yang Wu; Fang-I Hsiao; Yen-Ting Liu; Hsiang-Sheng Liang; Yi-Ching Chiu; Ming-Sui Lee; Mike Y. Chen

  • Affiliations:
  • National Taiwan University, Taipei, Taiwan, ROC (all authors)

  • Venue:
  • CHI '13 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2013

Abstract

Automatic screen rotation improves the viewing experience and usability of mobile devices, but current gravity-based approaches do not support postures such as lying on one's side, and manual rotation switches require explicit user input. iRotateGrasp automatically rotates the screens of mobile devices to match users' viewing orientations based on how users are grasping the devices. Our prototype uses a total of 44 capacitive sensors along the four sides and the back of an iPod Touch, and a support vector machine (SVM) to recognize grasps at 25 Hz. We collected usage data from 6 users under 108 different combinations of posture, orientation, touchscreen operation, and left/right/both hands. Our offline analysis showed that our grasp-based approach is promising, achieving 80.9% accuracy when training and testing on different users, and up to 96.7% if users are willing to train the system.
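The abstract's pipeline (44 capacitive sensor readings per frame, classified by an SVM into a viewing orientation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic grasp data, the orientation labels, and the `make_grasp` helper are all hypothetical, and scikit-learn's `SVC` stands in for whatever SVM library the prototype used.

```python
import numpy as np
from sklearn.svm import SVC

# Four candidate screen orientations (hypothetical label set).
ORIENTATIONS = ["portrait", "portrait-flipped", "landscape-left", "landscape-right"]

rng = np.random.default_rng(0)

def make_grasp(orientation_idx, n=30):
    """Simulate n grasp frames: 44-dim vectors of capacitive readings.

    Purely synthetic: each orientation is assumed to activate a different
    band of the 44 sensors (sides + back of the device) more strongly.
    """
    base = np.zeros(44)
    base[orientation_idx * 11:(orientation_idx + 1) * 11] = 1.0
    return base + 0.1 * rng.standard_normal((n, 44))

# Training set: labeled grasp frames for each orientation.
X = np.vstack([make_grasp(i) for i in range(len(ORIENTATIONS))])
y = np.repeat(np.arange(len(ORIENTATIONS)), 30)

clf = SVC(kernel="rbf")  # RBF-kernel SVM, as one plausible choice
clf.fit(X, y)

# At runtime, each new sensor frame (sampled at 25 Hz in the paper)
# is classified into a target screen orientation.
frame = make_grasp(2, n=1)
predicted = ORIENTATIONS[clf.predict(frame)[0]]
```

A real deployment would train on frames recorded from actual users (per-user training is what the abstract reports as reaching up to 96.7% accuracy), and would likely smooth predictions over several consecutive frames before rotating the screen.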