Automatic screen rotation improves the viewing experience and usability of mobile devices, but current gravity-based approaches do not support postures such as lying on one side, and manual rotation switches require explicit user input. iRotate Grasp automatically rotates the screen of a mobile device to match the user's viewing orientation based on how the user is grasping the device. Our insight is that users' grasps are consistent within each orientation but differ significantly between orientations. Our prototype embeds a total of 32 light sensors along the four sides and the back of an iPod Touch, and uses a support vector machine (SVM) to recognize grasps at 25 Hz. We collected usage data from 6 users under 54 different conditions (3 × 3 × 3 × 2): 1) grasping the device with the left hand, the right hand, and both hands, 2) scrolling, zooming, and typing, 3) in portrait, landscape-left, and landscape-right orientations, and 4) while sitting and lying on one side. Results show that our grasp-based approach is promising: our iRotate Grasp prototype correctly rotated the screen 90.5% of the time when trained and tested on different users.
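To make the recognition pipeline concrete, below is a minimal sketch of grasp-based orientation classification. It treats each frame of 32 light-sensor readings as a feature vector and classifies it into one of the three screen orientations. The sketch assumes scikit-learn's SVC as a stand-in for LIBSVM; the kernel choice, preprocessing, and all names are illustrative assumptions, not details from the paper.

```python
# Sketch of the grasp-to-orientation pipeline described above.
# Assumptions (not from the paper): scikit-learn's SVC stands in for
# LIBSVM, sensor frames arrive as length-32 arrays, and the SVM
# parameters and feature scaling are illustrative choices.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

ORIENTATIONS = ["portrait", "landscape-left", "landscape-right"]

def train_grasp_classifier(frames: np.ndarray, labels: np.ndarray):
    """Train on recorded data.

    frames: (n_samples, 32) light-sensor readings, one row per frame.
    labels: (n_samples,) orientation indices into ORIENTATIONS.
    """
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(frames, labels)
    return clf

def classify_frame(clf, frame: np.ndarray) -> str:
    """Classify a single 32-sensor frame; the prototype samples at ~25 Hz."""
    return ORIENTATIONS[int(clf.predict(frame.reshape(1, -1))[0])]
```

In this framing, cross-user accuracy (the 90.5% figure) corresponds to fitting the classifier on frames from some users and evaluating classify_frame on frames from users held out of training.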