Automatic screen rotation improves the viewing experience and usability of mobile devices, but current gravity-based approaches do not support postures such as lying on one side, and manual rotation switches require explicit user input. iRotateGrasp automatically rotates the screens of mobile devices to match users' viewing orientations based on how users grasp the devices. Our insight is that a user's grasp is consistent within each orientation but differs significantly between orientations. Our prototype uses a total of 44 capacitive sensors along the four sides and the back of an iPod Touch, and a support vector machine (SVM) to recognize grasps at 25 Hz. We collected usage data from six users under 108 combinations of posture, orientation, touchscreen operation, and left/right/both hands. Our offline analysis showed that the grasp-based approach is promising, with 80.9% accuracy when training and testing on different users, and up to 96.7% if users are willing to train the system. Our user study (N=16) showed that iRotateGrasp achieved 78.8% accuracy and was 31.3% more accurate than gravity-based rotation.
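The following is a minimal sketch (not the authors' code) of the classification step the abstract describes: one 44-dimensional capacitive grasp reading mapped to a screen orientation with an SVM. The sensor layout constant, the orientation labels, the RBF kernel choice, and the placeholder training data are all illustrative assumptions; scikit-learn's SVC is used here as a stand-in since it wraps LIBSVM internally.

```python
# Hypothetical sketch of SVM-based grasp-to-orientation classification,
# assuming 44 capacitive readings per frame as in the iRotateGrasp prototype.
import numpy as np
from sklearn.svm import SVC

N_SENSORS = 44  # sensors along the four sides and the back of the device
ORIENTATIONS = ["portrait", "portrait-flipped", "landscape-left", "landscape-right"]

# Placeholder training data (assumption): rows are capacitive sensor frames,
# labels are the orientation the user was viewing when the grasp was recorded.
rng = np.random.default_rng(0)
X_train = rng.random((400, N_SENSORS))
y_train = rng.integers(0, len(ORIENTATIONS), size=400)

clf = SVC(kernel="rbf")  # kernel choice is an assumption, not from the paper
clf.fit(X_train, y_train)

def classify_grasp(reading: np.ndarray) -> str:
    """Map one 44-sensor capacitive frame to a predicted viewing orientation."""
    return ORIENTATIONS[int(clf.predict(reading.reshape(1, -1))[0])]

# At 25 Hz, each incoming sensor frame would be classified like this:
print(classify_grasp(rng.random(N_SENSORS)))
```

In a deployment matching the abstract, the classifier would be trained either on other users' grasps (the 80.9% cross-user condition) or on the device owner's own samples (the up-to-96.7% per-user condition), and the predicted orientation would drive the screen-rotation logic.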